WorldWideScience

Sample records for surface based computing

  1. Computer aided surface representation

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R E

    1987-11-01

    The aims of this research are the creation of new surface forms and the determination of geometric and physical properties of surfaces. The full sweep from constructive mathematics through the implementation of algorithms and the interactive computer graphics display of surfaces is utilized. Both three-dimensional and multi-dimensional surfaces are considered. Particular emphasis is given to the scientific computing solution of Department of Energy problems. The methods that we have developed and that we are proposing to develop allow applications such as: producing smooth contour maps from measured data, such as weather maps; modeling the heat distribution inside a furnace from sample measurements; and terrain modeling based on satellite pictures. The investigation of new surface forms includes the topics of triangular interpolants, multivariate interpolation, surfaces defined on surfaces, and monotone and/or convex surfaces. The geometric and physical properties considered include contours, the intersection of surfaces, curvatures as an interrogation tool, and numerical integration.
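
    As an illustration of the first application listed above (smooth contour maps from measured data), the following minimal Python sketch interpolates scattered measurements onto a grid and extracts iso-value contours. The data, grid and method choice are hypothetical; the report does not describe an implementation at this level.

      import numpy as np
      import matplotlib.pyplot as plt
      from scipy.interpolate import griddata

      # Hypothetical scattered measurements (e.g., weather-station readings).
      rng = np.random.default_rng(0)
      pts = rng.uniform(0.0, 10.0, size=(200, 2))        # (x, y) sample locations
      vals = np.sin(pts[:, 0]) + np.cos(pts[:, 1])       # measured values

      # Interpolate onto a regular grid (cubic interpolation of scattered data).
      gx, gy = np.meshgrid(np.linspace(0, 10, 101), np.linspace(0, 10, 101))
      grid = griddata(pts, vals, (gx, gy), method="cubic")

      # Extract iso-value contours from the gridded surface.
      plt.contour(gx, gy, grid, levels=10)
      plt.savefig("contour_map.png")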

  2. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room” and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to support activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent tension between maintaining the privacy of medical data and presenting it in a public display environment can be mitigated by the use of CLINICAL SURFACES.

  3. Computational Complexity of Combinatorial Surfaces

    NARCIS (Netherlands)

    Vegter, Gert; Yap, Chee K.

    1990-01-01

    We investigate the computational problems associated with combinatorial surfaces. Specifically, we present an algorithm (based on the Brahana-Dehn-Heegaard approach) for transforming the polygonal schema of a closed triangulated surface into its canonical form in O(n log n) time, where n is the

  4. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    Science.gov (United States)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on a pre-computed surface, to determine the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under grid fault. However, existing methods are difficult to apply to the DFIG short circuit current in engineering practice because of their complexity. A short circuit calculation method based on a pre-computed surface was therefore proposed by constructing the surface of short circuit current as a function of the calculating impedance and the open circuit voltage. The short circuit currents were derived by taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of short circuit current at different times were established, and the procedure for DFIG short circuit calculation considering LVRT was designed. The correctness of the proposed method was verified by simulation.
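
    The core idea, as described, is to tabulate the short circuit current offline over a grid of calculating impedance and open circuit voltage and then interpolate at run time. A minimal Python sketch of such a table lookup follows; the grid ranges and the placeholder current function are assumptions, not data from the paper.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      # Hypothetical pre-computed surface: |I_sc| tabulated over calculating
      # impedance Z (ohm) and open circuit voltage U (p.u.) at one fault time.
      Z = np.linspace(0.05, 1.0, 40)
      U = np.linspace(0.2, 1.1, 30)
      ZZ, UU = np.meshgrid(Z, U, indexing="ij")
      I_table = UU / ZZ                      # placeholder for the offline computation

      surface = RegularGridInterpolator((Z, U), I_table)

      # Online use: interpolate the short circuit current for a given network state.
      print(surface([[0.31, 0.87]]))         # -> interpolated |I_sc|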

  5. Computer aided surface representation

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  6. A computer-aided surface roughness measurement system

    International Nuclear Information System (INIS)

    Hughes, F.J.; Schankula, M.H.

    1983-11-01

    A diamond stylus profilometer with a computer-based data acquisition/analysis system is being used to characterize surfaces of reactor components and materials, and to examine the effects of surface topography on thermal contact conductance. The current system is described; measurement problems and system development are discussed in general terms and possible future improvements are outlined.
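
    For context, stylus-profilometer analysis typically reduces the sampled profile to amplitude parameters such as Ra and Rq; a small Python sketch of that reduction is given below. The profile data are simulated, and the report's actual analysis software is not described at this level.

      import numpy as np

      # Simulated stylus profile: heights (um) sampled at uniform spacing.
      z = np.random.default_rng(1).normal(0.0, 0.4, size=5000)

      # Remove the mean line, then compute standard amplitude parameters.
      z = z - z.mean()
      Ra = np.mean(np.abs(z))       # arithmetic mean roughness
      Rq = np.sqrt(np.mean(z**2))   # root-mean-square roughness
      Rt = z.max() - z.min()        # total height of the profile
      print(f"Ra={Ra:.3f} um  Rq={Rq:.3f} um  Rt={Rt:.3f} um")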

  7. Preliminary Study on Hybrid Computational Phantom for Radiation Dosimetry Based on Subdivision Surface

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Choi, Sang Hyoun; Cho, Sung Koo; Kim, Chan Hyeong

    2007-01-01

    The anthropomorphic computational phantoms are classified into two groups. One group is the stylized phantoms, or MIRD phantoms, which are based on mathematical representations of the anatomical structures. The shapes and positions of the organs and tissues in these phantoms can be adjusted by changing the coefficients of the equations in use. The other group is the voxel phantoms, which are based on tomographic images of a real person such as CT, MR and serially sectioned color slice images from a cadaver. Obviously, the voxel phantoms represent the anatomical structures of a human body much more realistically than the stylized phantoms. A realistic representation of anatomical structure is very important for an accurate calculation of radiation dose in the human body. Consequently, the ICRP recently has decided to use the voxel phantoms for the forthcoming update of the dose conversion coefficients. However, the voxel phantoms also have some limitations: (1) The topology and dimensions of the organs and tissues in a voxel model are extremely difficult to change, and (2) The thin organs, such as oral mucosa and skin, cannot be realistically modeled unless the voxel resolution is prohibitively high. Recently, a new approach has been implemented by several investigators. The investigators converted their voxel phantoms to hybrid computational phantoms based on NURBS (Non-Uniform Rational B-Splines) surface, which is smooth and deformable. It is claimed that these new phantoms have the flexibility of the stylized phantom along with the realistic representations of the anatomical structures. The topology and dimensions of the anatomical structures can be easily changed as necessary. Thin organs can be modeled without affecting computational speed or memory requirement. The hybrid phantoms can be also used for 4-D Monte Carlo simulations. In this preliminary study, the external shape of a voxel phantom (i.e., skin), HDRK-Man, was converted to a hybrid computational

  8. A computer vision system for rapid search inspired by surface-based attention mechanisms from human perception.

    Science.gov (United States)

    Mohr, Johannes; Park, Jong-Han; Obermayer, Klaus

    2014-12-01

    Humans are highly efficient at visual search tasks by focusing selective attention on a small but relevant region of a visual scene. Recent results from biological vision suggest that surfaces of distinct physical objects form the basic units of this attentional process. The aim of this paper is to demonstrate how such surface-based attention mechanisms can speed up a computer vision system for visual search. The system uses fast perceptual grouping of depth cues to represent the visual world at the level of surfaces. This representation is stored in short-term memory and updated over time. A top-down guided attention mechanism sequentially selects one of the surfaces for detailed inspection by a recognition module. We show that the proposed attention framework requires little computational overhead (about 11 ms), but enables the system to operate in real-time and leads to a substantial increase in search efficiency. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Computational method for free surface hydrodynamics

    International Nuclear Information System (INIS)

    Hirt, C.W.; Nichols, B.D.

    1980-01-01

    There are numerous flow phenomena in pressure vessel and piping systems that involve the dynamics of free fluid surfaces. For example, fluid interfaces must be considered during the draining or filling of tanks, in the formation and collapse of vapor bubbles, and in seismically shaken vessels that are partially filled. To aid in the analysis of these types of flow phenomena, a new technique has been developed for the computation of complicated free-surface motions. This technique is based on the concept of a local average volume of fluid (VOF) and is embodied in a computer program for two-dimensional, transient fluid flow called SOLA-VOF. The basic approach used in the VOF technique is briefly described, and compared to other free-surface methods. Specific capabilities of the SOLA-VOF program are illustrated by generic examples of bubble growth and collapse, flows of immiscible fluid mixtures, and the confinement of spilled liquids
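
    The VOF idea is to transport a fractional volume-of-fluid field F (0 = empty, 1 = full) with the flow and to locate the free surface from it. The Python sketch below shows only the simplest donor-cell transport of F in one dimension; it omits the geometric interface reconstruction and pressure solution that SOLA-VOF actually performs, and all parameters are illustrative.

      import numpy as np

      # 1D donor-cell (upwind) transport of the fluid fraction F with velocity u > 0.
      nx, dx, dt, u = 100, 0.01, 0.001, 1.0
      F = np.zeros(nx)
      F[:30] = 1.0                      # initial fluid column in the left cells

      for _ in range(500):
          flux = u * F                  # fluid volume leaving each cell per unit time
          F[1:] -= dt / dx * (flux[1:] - flux[:-1])
          F[0] = 1.0                    # inflow boundary kept full
          np.clip(F, 0.0, 1.0, out=F)   # keep the fraction physically bounded

      # The free surface sits where F drops from ~1 to ~0.
      print(int(np.argmax(F < 0.5)))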

  10. Computation of Surface Integrals of Curl Vector Fields

    Science.gov (United States)

    Hu, Chenglie

    2007-01-01

    This article presents a way of computing a surface integral when the vector field of the integrand is a curl field. Presented in some advanced calculus textbooks such as [1], the technique, as the author experienced, is simple and applicable. The computation is based on Stokes' theorem in 3-space calculus, and thus provides not only a means to…
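
    For reference, the identity the article relies on is the classical Stokes theorem, quoted here in its standard form rather than from the article itself:

      \iint_{S} (\nabla \times \mathbf{F}) \cdot d\mathbf{S}
          = \oint_{\partial S} \mathbf{F} \cdot d\mathbf{r}

    so that, whenever the integrand is recognized as \nabla \times \mathbf{F}, the surface integral over S can be evaluated by parameterizing only the boundary curve \partial S.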

  11. The application of computational thermodynamics and a numerical model for the determination of surface tension and Gibbs-Thomson coefficient of aluminum based alloys

    International Nuclear Information System (INIS)

    Jacome, Paulo A.D.; Landim, Mariana C.; Garcia, Amauri; Furtado, Alexandre F.; Ferreira, Ivaldo L.

    2011-01-01

    Highlights: → Surface tension and the Gibbs-Thomson coefficient are computed for Al-based alloys. → Butler's scheme and ThermoCalc are used to compute the thermophysical properties. → Predictive cell/dendrite growth models depend on accurate thermophysical properties. → Mechanical properties can be related to the microstructural cell/dendrite spacing. - Abstract: In this paper, a solution for Butler's formulation is presented permitting the surface tension and the Gibbs-Thomson coefficient of Al-based binary alloys to be determined. The importance of Gibbs-Thomson coefficient for binary alloys is related to the reliability of predictions furnished by predictive cellular and dendritic growth models and of numerical computations of solidification thermal variables, which will be strongly dependent on the thermophysical properties assumed for the calculations. A numerical model based on Powell hybrid algorithm and a finite difference Jacobian approximation was coupled to a specific interface of a computational thermodynamics software in order to assess the excess Gibbs energy of the liquid phase, permitting the surface tension and Gibbs-Thomson coefficient for Al-Fe, Al-Ni, Al-Cu and Al-Si hypoeutectic alloys to be calculated. The computed results are presented as a function of the alloy composition.
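
    For orientation, Butler's formulation and the Gibbs-Thomson coefficient are commonly written in the following textbook forms, quoted here for context rather than reproduced from the paper:

      \sigma = \sigma_i + \frac{RT}{A_i}
               \ln\!\left(\frac{x_i^{S}\,\gamma_i^{S}}{x_i^{B}\,\gamma_i^{B}}\right),
      \qquad
      \Gamma = \frac{\sigma}{\Delta S_f}

    where A_i is the molar surface area of component i, the superscripts S and B denote the surface and bulk phases, and \Delta S_f is the entropy of fusion per unit volume; the excess Gibbs energy of the liquid assessed from the thermodynamic database enters through the activity coefficients \gamma_i.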

  12. Free surface profiles in river flows: Can standard energy-based gradually-varied flow computations be pursued?

    Science.gov (United States)

    Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish

    2015-10-01

    Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, like HEC-RAS, that adopt a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not particularly accurate in light of the advances made over the last three decades. This paper firstly presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, in which both the energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations make it possible to describe the flow profiles with more generality than gradually-varied flow computations. As an outcome, the results for gradually-varied flow provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate, whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
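
    The energy-based gradually-varied flow computation referred to here integrates the standard backwater equation, quoted in its textbook form rather than from the paper:

      \frac{dy}{dx} = \frac{S_0 - S_f}{1 - \mathrm{Fr}^2},
      \qquad
      \mathrm{Fr}^2 = \frac{\alpha\, Q^2\, T}{g\, A^3}

    where y is the flow depth, S_0 the bed slope, S_f the friction slope, Q the discharge, A the flow area, T the top width and \alpha the velocity distribution coefficient; the compound-channel discharge prediction method enters through S_f and \alpha, which is why it dominates the accuracy of the computed profile.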

  13. Protein consensus-based surface engineering (ProCoS): a computer-assisted method for directed protein evolution.

    Science.gov (United States)

    Shivange, Amol V; Hoeffken, Hans Wolfgang; Haefner, Stefan; Schwaneberg, Ulrich

    2016-12-01

    Protein consensus-based surface engineering (ProCoS) is a simple and efficient method for directed protein evolution combining computational analysis and molecular biology tools to engineer protein surfaces. ProCoS is based on the hypothesis that conserved residues originated from a common ancestor and that these residues are crucial for the function of a protein, whereas highly variable regions (situated on the surface of a protein) can be targeted for surface engineering to maximize performance. ProCoS comprises four main steps: (i) identification of conserved and highly variable regions; (ii) protein sequence design by substituting residues in the highly variable regions, and gene synthesis; (iii) in vitro DNA recombination of synthetic genes; and (iv) screening for active variants. ProCoS is a simple method for surface mutagenesis in which multiple sequence alignment is used for selection of surface residues based on a structural model. To demonstrate the technique's utility for directed evolution, the surface of a phytase enzyme from Yersinia mollaretii (Ymphytase) was subjected to ProCoS. Screening just 1050 clones from ProCoS engineering-guided mutant libraries yielded an enzyme with 34 amino acid substitutions. The surface-engineered Ymphytase exhibited 3.8-fold higher pH stability (at pH 2.8 for 3 h) and retained 40% of the enzyme's specific activity (400 U/mg) compared with the wild-type Ymphytase. The pH stability might be attributed to a significantly increased (20 percentage points; from 9% to 29%) number of negatively charged amino acids on the surface of the engineered phytase.
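
    Step (i) of ProCoS amounts to scoring conservation per alignment column. A toy Python sketch of such scoring (Shannon entropy over a hypothetical multiple sequence alignment) is shown below; the actual ProCoS tooling, thresholds and the structural surface-exposure check are not described at this level in the record.

      import math
      from collections import Counter

      # Hypothetical multiple sequence alignment (rows = homologous sequences).
      msa = [
          "MKTAYIAKQR",
          "MKTAYLAKQR",
          "MKSVYIAEQR",
          "MKTGYIARQR",
      ]

      def column_entropy(column):
          """Shannon entropy of one alignment column (0 = fully conserved)."""
          counts = Counter(column)
          n = len(column)
          return -sum(c / n * math.log2(c / n) for c in counts.values())

      entropies = [column_entropy(col) for col in zip(*msa)]

      # Low-entropy columns ~ conserved core; high-entropy columns are candidates
      # for surface engineering (subject to a surface-exposure check on a model).
      variable = [i for i, h in enumerate(entropies) if h > 1.0]
      print(variable)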

  14. The application of computational thermodynamics and a numerical model for the determination of surface tension and Gibbs-Thomson coefficient of aluminum based alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jacome, Paulo A.D.; Landim, Mariana C. [Department of Mechanical Engineering, Fluminense Federal University, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil); Garcia, Amauri, E-mail: amaurig@fem.unicamp.br [Department of Materials Engineering, University of Campinas, UNICAMP, PO Box 6122, 13083-970 Campinas, SP (Brazil); Furtado, Alexandre F.; Ferreira, Ivaldo L. [Department of Mechanical Engineering, Fluminense Federal University, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil)

    2011-08-20

    Highlights: → Surface tension and the Gibbs-Thomson coefficient are computed for Al-based alloys. → Butler's scheme and ThermoCalc are used to compute the thermophysical properties. → Predictive cell/dendrite growth models depend on accurate thermophysical properties. → Mechanical properties can be related to the microstructural cell/dendrite spacing. - Abstract: In this paper, a solution for Butler's formulation is presented permitting the surface tension and the Gibbs-Thomson coefficient of Al-based binary alloys to be determined. The importance of Gibbs-Thomson coefficient for binary alloys is related to the reliability of predictions furnished by predictive cellular and dendritic growth models and of numerical computations of solidification thermal variables, which will be strongly dependent on the thermophysical properties assumed for the calculations. A numerical model based on Powell hybrid algorithm and a finite difference Jacobian approximation was coupled to a specific interface of a computational thermodynamics software in order to assess the excess Gibbs energy of the liquid phase, permitting the surface tension and Gibbs-Thomson coefficient for Al-Fe, Al-Ni, Al-Cu and Al-Si hypoeutectic alloys to be calculated. The computed results are presented as a function of the alloy composition.

  15. Simulation of Specular Surface Imaging Based on Computer Graphics: Application on a Vision Inspection System

    Directory of Open Access Journals (Sweden)

    Seulin Ralph

    2002-01-01

    This work aims at detecting surface defects on reflecting industrial parts. A machine vision system, performing the detection of geometric-aspect surface defects, is completely described. The revealing of defects is realized by a particular lighting device, which has been carefully designed to ensure the imaging of defects. The lighting system greatly simplifies the image processing for defect segmentation, so that real-time inspection of reflective products is possible. To aid in the design of the imaging conditions, a complete simulation is proposed. The simulation, based on computer graphics, enables the rendering of realistic images. Simulation provides a very efficient way to perform tests compared with the numerous manual experiments otherwise required.

  16. Computer representation of molecular surfaces

    International Nuclear Information System (INIS)

    Max, N.L.

    1981-01-01

    This review article surveys recent work on computer representation of molecular surfaces. Several different algorithms are discussed for producing vector or raster drawings of space-filling models formed as the union of spheres. Other smoother surfaces are also considered

  17. Efficient 3D geometric and Zernike moments computation from unstructured surface meshes.

    Science.gov (United States)

    Pozo, José María; Villa-Uriol, Maria-Cruz; Frangi, Alejandro F

    2011-03-01

    This paper introduces and evaluates a fast exact algorithm and a series of faster approximate algorithms for the computation of 3D geometric moments from an unstructured surface mesh of triangles. Being based on the object surface reduces the computational complexity of these algorithms with respect to volumetric grid-based algorithms. In contrast, it can only be applied to the computation of geometric moments of homogeneous objects. This advantage and restriction is shared with other proposed algorithms based on the object boundary. The proposed exact algorithm reduces the computational complexity for computing geometric moments up to order N with respect to previously proposed exact algorithms, from N^9 to N^6. The approximate series algorithm appears as a power series on the ratio between triangle size and object size, which can be truncated at any desired degree. The higher the number and quality of the triangles, the better the approximation. This approximate algorithm reduces the computational complexity to N^3. In addition, the paper introduces a fast algorithm for the computation of 3D Zernike moments from the computed geometric moments, with a computational complexity of N^4, while the previously proposed algorithm is of order N^6. The error introduced by the proposed approximate algorithms is evaluated on different shapes, and the cost-benefit ratio in terms of error and computational time is analyzed for different moment orders.
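
    Surface-based evaluation of low-order geometric moments uses the divergence theorem: each triangle contributes the signed tetrahedron it spans with the origin. A minimal Python sketch for the zeroth and first moments (volume and centroid) of a closed, consistently oriented mesh is given below; the tetrahedron at the end is just a test shape, and the paper's higher-order and Zernike computations are not reproduced here.

      import numpy as np

      def moments_from_mesh(vertices, faces):
          """Volume (m000) and centroid of a closed, outward-oriented triangle mesh."""
          v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
          det = np.einsum("ij,ij->i", v0, np.cross(v1, v2))   # 6 * signed tet volumes
          volume = det.sum() / 6.0
          first = (det[:, None] * (v0 + v1 + v2)).sum(axis=0) / 24.0  # integral of x_k dV
          return volume, first / volume

      # Test shape: a tetrahedron with vertices at the origin and the unit axes.
      V = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
      F = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
      print(moments_from_mesh(V, F))   # volume = 1/6, centroid = (1/4, 1/4, 1/4)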

  18. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  19. A GPU-based mipmapping method for water surface visualization

    Science.gov (United States)

    Li, Hua; Quan, Wei; Xu, Chao; Wu, Yan

    2018-03-01

    Visualization of water surfaces is a hot topic in computer graphics. In this paper, we present a fast method to generate a wide range of water surface with good image quality both near and far from the viewpoint. This method utilizes a uniform mesh and fractal Perlin noise to model the water surface. Mipmapping is applied to the surface textures, adjusting their resolution with respect to the distance from the viewpoint and reducing the computing cost. Lighting effects are computed based on shadow mapping, Snell's law and the Fresnel term. The render pipeline utilizes a CPU-GPU shared memory structure, which improves the rendering efficiency. Experimental results show that our approach visualizes the water surface with good image quality at real-time frame rates.

  20. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  1. Computer simulation of the topography evolution on ion bombarded surfaces

    CERN Document Server

    Zier, M

    2003-01-01

    The development of roughness on ion bombarded surfaces (facets, ripples) on single crystalline and amorphous homogeneous solids plays an important role, for example, in depth profiling techniques. To verify a faceting mechanism based not only on sputtering by directly impinging ions but also on the contribution of reflected ions and the redeposition of sputtered material, a computer simulation has been carried out. The surface in this model is treated as a two-dimensional line segment profile. The model describes the topography evolution on ion bombarded surfaces, including the growth mechanism of a facetted surface, using only the interplay of reflected and primary ions and redeposited atoms.

  2. NURBS-based 3-d anthropomorphic computational phantoms for radiation dosimetry applications

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lodwick, Daniel; Lee, Choonik; Bolch, Wesley E.

    2007-01-01

    Computational anthropomorphic phantoms are computer models used in the evaluation of absorbed dose distributions within the human body. Currently, two classes of computational phantoms have been developed and widely utilised for dosimetry calculation: (1) stylized (equation-based) and (2) voxel (image-based) phantoms, describing human anatomy through the use of mathematical surface equations and 3-D voxel matrices, respectively. However, stylized phantoms have limitations in defining realistic organ contours and positioning as compared to voxel phantoms, which are themselves based on medical images of human subjects. In turn, voxel phantoms that have been developed through medical image segmentation have limitations in describing organs that are presented in low contrast within either magnetic resonance or computed tomography images. The present paper reviews the advantages and disadvantages of these existing classes of computational phantoms and introduces a hybrid approach to computational phantom construction based on non-uniform rational B-spline (NURBS) surface animation technology, which takes advantage of the most desirable features of the former two phantom types. (authors)

  3. The Actuator Surface Model: A New Navier-Stokes Based Model for Rotor Computations

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhang, J.H.; Sørensen, Jens Nørkær

    2009-01-01

    This paper presents a new numerical technique for simulating two-dimensional wind turbine flow. The method, denoted as the 2D actuator surface technique, consists of a two-dimensional Navier-Stokes solver in which the pressure distribution is represented by body forces that are distributed along … In the last part, the actuator surface technique is applied to compute the flow past a two-bladed vertical axis wind turbine equipped with NACA 0012 airfoils. Comparisons with experimental data show an encouraging performance of the method.

  4. Computational efficiency for the surface renewal method

    Science.gov (United States)

    Kelley, Jason; Higgins, Chad

    2018-04-01

    Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and they were tested for sensitivity to the length of the flux averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. These algorithms utilize signal processing techniques and algebraic simplifications, demonstrating simple modifications that dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. The increased speed of computation grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
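
    Much of the cost in SR processing comes from evaluating structure functions of the high-frequency scalar record at many time lags; the vectorized Python sketch below illustrates the kind of algebraic simplification meant here (array slicing instead of explicit loops). The synthetic record, lag choices and structure-function orders are illustrative assumptions, not the paper's settings.

      import numpy as np

      def structure_functions(ts, lags, orders=(2, 3, 5)):
          """Vectorized n-th order structure functions S_n(r) = <(x_t - x_{t-r})^n>."""
          out = {}
          for r in lags:
              d = ts[r:] - ts[:-r]                 # all differences at lag r at once
              out[r] = {n: np.mean(d ** n) for n in orders}
          return out

      # Synthetic 10 Hz air-temperature record (30 min) with ramp-like structure.
      rng = np.random.default_rng(2)
      t = np.arange(0, 1800, 0.1)
      temp = 0.3 * ((t % 60) / 60.0) + 0.05 * rng.standard_normal(t.size)

      print(structure_functions(temp, lags=[5, 10, 20]))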

  5. Structure analysis of Pd(1 1 0) surface by computer simulation of NAACO's

    International Nuclear Information System (INIS)

    Takeuchi, Wataru; Yamamura, Yasunori

    2001-01-01

    The relaxation at a Pd(1 1 0) surface has been estimated using computer simulation of 165° neutral impact-collision ion scattering spectroscopy (NICISS). The computer simulations, employing the ACOCT program code, which treats the atomic collisions three-dimensionally and is based on the binary collision approximation (BCA), were performed for the case of 2.08 keV Ne+ ions incident along the [1 1̄ 2] azimuth of the Pd(1 1 0) surface. The experimental results of Speller et al. (Surf. Sci. 383 (1997) 131) are well reproduced by the ACOCT simulations when an inward relaxation of 14% of the first interlayer spacing and a vertical component of the surface Debye temperature of 125 K are included.

  6. Holonomic surface codes for fault-tolerant quantum computation

    Science.gov (United States)

    Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco

    2018-02-01

    Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.

  7. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  8. Silicon CMOS architecture for a spin-based quantum computer.

    Science.gov (United States)

    Veldhorst, M; Eenink, H G J; Yang, C H; Dzurak, A S

    2017-12-15

    Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits. However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system. The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout. We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing.

  9. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existing computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples for solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  10. Accurate measurement of surface areas of anatomical structures by computer-assisted triangulation of computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Allardice, J.T.; Jacomb-Hood, J.; Abulafi, A.M.; Williams, N.S. (Royal London Hospital (United Kingdom)); Cookson, J.; Dykes, E.; Holman, J. (London Hospital Medical College (United Kingdom))

    1993-05-01

    There is a need for accurate surface area measurement of internal anatomical structures in order to define light dosimetry in adjunctive intraoperative photodynamic therapy (AIOPDT). The authors investigated whether computer-assisted triangulation of serial sections generated by computed tomography (CT) scanning can give an accurate assessment of the surface area of the walls of the true pelvis after anterior resection and before colorectal anastomosis. They show that the technique of paper density tessellation is an acceptable method of measuring the surface areas of phantom objects, with a maximum error of 0.5%, and is used as the gold standard. Computer-assisted triangulation of CT images of standard geometric objects and accurately-constructed pelvic phantoms gives a surface area assessment with a maximum error of 2.5% compared with the gold standard. The CT images of 20 patients' pelves have been analysed by computer-assisted triangulation and this shows the surface area of the walls varies from 143 cm² to 392 cm². (Author).
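
    The triangulation step ultimately reduces to summing triangle areas; a short Python sketch of that computation, on a hypothetical vertex/face array rather than the study's CT data, is:

      import numpy as np

      def mesh_surface_area(vertices, faces):
          """Total area of a triangle mesh: sum of 0.5 * |(v1-v0) x (v2-v0)|."""
          v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
          cross = np.cross(v1 - v0, v2 - v0)
          return 0.5 * np.linalg.norm(cross, axis=1).sum()

      # Sanity check: the unit cube (12 triangles) has surface area 6.
      V = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
      F = np.array([[0, 1, 3], [0, 3, 2], [4, 6, 7], [4, 7, 5],
                    [0, 4, 5], [0, 5, 1], [2, 3, 7], [2, 7, 6],
                    [0, 2, 6], [0, 6, 4], [1, 5, 7], [1, 7, 3]])
      print(mesh_surface_area(V, F))   # -> 6.0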

  11. VASCo: computation and visualization of annotated protein surface contacts

    Directory of Open Access Journals (Sweden)

    Thallinger Gerhard G

    2009-01-01

    Background: Structural data from crystallographic analyses contain a vast amount of information on protein-protein contacts. Knowledge of protein-protein interactions is essential for understanding many processes in living cells. The methods to investigate these interactions range from genetics to biophysics, crystallography, bioinformatics and computer modeling. Crystal contact information can also be useful for understanding biologically relevant protein oligomerisation, as both rely in principle on the same physico-chemical interaction forces. Visualization of crystal and biological contact data, including different surface properties, can help to analyse protein-protein interactions. Results: VASCo is a program package for the calculation of protein surface properties and the visualization of annotated surfaces. Special emphasis is laid on protein-protein interactions, which are calculated based on surface point distances. The same approach is used to compare surfaces of two aligned molecules. Molecular properties such as electrostatic potential or hydrophobicity are mapped onto these surface points. Molecular surfaces and the corresponding properties are calculated using well-established programs integrated into the package, as well as custom-developed programs. The modular package can easily be extended to include new properties for annotation. The output of the program is most conveniently displayed in PyMOL using a custom-made plug-in. Conclusion: VASCo supplements other available protein contact visualisation tools and provides additional information on biological interactions as well as on crystal contacts. The tool provides a unique feature to compare surfaces of two aligned molecules based on point distances and thereby facilitates the visualization and analysis of surface differences.
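
    Contact detection based on surface point distances, as described for VASCo, can be sketched with a KD-tree query between two point sets. The coordinates and the 5 Å cut-off in the Python sketch below are placeholders, not the program's actual surfaces or parameters.

      import numpy as np
      from scipy.spatial import cKDTree

      # Hypothetical surface point clouds of two protein chains (N x 3, in angstrom).
      rng = np.random.default_rng(3)
      surface_a = rng.uniform(0, 50, size=(2000, 3))
      surface_b = rng.uniform(40, 90, size=(2000, 3))

      # Surface points of chain A that lie within a distance cut-off of chain B.
      cutoff = 5.0
      tree_b = cKDTree(surface_b)
      dist, _ = tree_b.query(surface_a, k=1, distance_upper_bound=cutoff)
      contact_points_a = np.where(np.isfinite(dist))[0]
      print(len(contact_points_a), "contact points on chain A")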

  12. Transient Convection, Diffusion, and Adsorption in Surface-Based Biosensors

    DEFF Research Database (Denmark)

    Hansen, Rasmus; Bruus, Henrik; Callisen, Thomas H.

    2012-01-01

    This paper presents a theoretical and computational investigation of convection, diffusion, and adsorption in surface-based biosensors. In particular, we study the transport dynamics in a model geometry of a surface plasmon resonance (SPR) sensor. The work, however, is equally relevant for other...... microfluidic surface-based biosensors, operating under flow conditions. A widely adopted approximate quasi-steady theory to capture convective and diffusive mass transport is reviewed, and an analytical solution is presented. An expression of the Damköhler number is derived in terms of the nondimensional...... concentration to the maximum surface capacity is critical for reliable use of the quasi-steady theory. Finally, our results provide users of surface-based biosensors with a tool for correcting experimentally obtained adsorption rate constants....

  13. Computer studies of surface structure of NiAl(111)

    International Nuclear Information System (INIS)

    Takeuchi, Wataru; Yamamura, Yasunori

    1994-01-01

    The 180 neutral impact-collision ion scattering spectroscopy (NICISS) data have been analyzed using the ACOCT program code based on the binary collision approximation (BCA). The computer simulations are performed for the case of 2 keV He + ions incident along the [ anti 12 anti 1] direction of a NiAl(111) surface. It is found that the experimental results are well reproduced by the ACOCT simulations including the inward relaxation of 40% of the first interlayer spacing on Ni terminated layer at the NiAl(111) surface and including the Moliere approximation of the Thomas-Fermi potential with a reduced Firsov screening length, multiplied by a factor of 0.60. (orig.)

  14. Applying tensor-based morphometry to parametric surfaces can improve MRI-based disease diagnosis.

    Science.gov (United States)

    Wang, Yalin; Yuan, Lei; Shi, Jie; Greve, Alexander; Ye, Jieping; Toga, Arthur W; Reiss, Allan L; Thompson, Paul M

    2013-07-01

    Many methods have been proposed for computer-assisted diagnostic classification. Full tensor information and machine learning with 3D maps derived from brain images may help detect subtle differences or classify subjects into different groups. Here we develop a new approach to apply tensor-based morphometry to parametric surface models for diagnostic classification. We use this approach to identify cortical surface features for use in diagnostic classifiers. First, with holomorphic 1-forms, we compute an efficient and accurate conformal mapping from a multiply connected mesh to the so-called slit domain. Next, the surface parameterization approach provides a natural way to register anatomical surfaces across subjects using a constrained harmonic map. To analyze anatomical differences, we then analyze the full Riemannian surface metric tensors, which retain multivariate information on local surface geometry. As the number of voxels in a 3D image is large, sparse learning is a promising method to select a subset of imaging features and to improve classification accuracy. Focusing on vertices with the greatest effect sizes, we train a diagnostic classifier using the surface features selected by an L1-norm based sparse learning method. Stability selection is applied to validate the selected feature sets. We tested the algorithm on MRI-derived cortical surfaces from 42 subjects with genetically confirmed Williams syndrome and 40 age-matched controls. Multivariate statistics on the local tensors gave greater effect sizes for detecting group differences relative to other TBM-based statistics, including analysis of the Jacobian determinant and the largest eigenvalue of the surface metric. Our method also gave reasonable classification results relative to the Jacobian determinant, the pair of eigenvalues of the Jacobian matrix, and volume features. This analysis pipeline may boost the power of morphometry studies, and may assist with image-based classification. Copyright © 2013
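
    The sparse-learning step (selecting a subset of vertex-wise surface features with an L1 penalty and training a classifier) can be sketched in a few lines with scikit-learn. The synthetic features below stand in for the surface metric-tensor features of the study, and this generic L1-logistic setup is only an analogue of the paper's specific sparse learning and stability selection pipeline.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for vertex-wise features: 82 subjects x 5000 features.
      rng = np.random.default_rng(4)
      X = rng.standard_normal((82, 5000))
      y = np.r_[np.ones(42), np.zeros(40)]          # e.g., patients vs controls
      X[y == 1, :20] += 0.8                         # a few informative features

      # L1-penalized logistic regression: feature selection and classification in one.
      clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
      print(cross_val_score(clf, X, y, cv=5).mean())
      clf.fit(X, y)
      selected = np.flatnonzero(clf.coef_[0])       # indices of retained features
      print(len(selected), "features selected")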

  15. Isolation of the ocular surface to treat dysfunctional tear syndrome associated with computer use.

    Science.gov (United States)

    Yee, Richard W; Sperling, Harry G; Kattek, Ashballa; Paukert, Martin T; Dawson, Kevin; Garcia, Marcie; Hilsenbeck, Susan

    2007-10-01

    Dysfunctional tear syndrome (DTS) associated with computer use is characterized by mild irritation, itching, redness, and intermittent tearing after extended staring. It frequently involves a foreign body or sandy sensation, blurring of vision, and fatigue, worsening especially at the end of the day. We undertook a study to determine the effectiveness of periocular isolation using microenvironment glasses (MEGS), alone and in combination with artificial tears, in alleviating the symptoms and signs of dry eye related to computer use. At the same time, we evaluated the relative ability of a battery of clinical tests for dry eye to distinguish dry eyes from normal eyes in heavy computer users. Forty adult subjects who used computers 3 hours or more per day were divided into dry eye sufferers and controls based on their scores on the Ocular Surface Disease Index (OSDI). Baseline scores were recorded and ocular surface assessments were made. On four subsequent visits, the subjects played a computer game for 30 minutes in a controlled environment, during which one of four treatment conditions was applied, in random order, to each subject: 1) no treatment, 2) artificial tears, 3) MEGS, and 4) artificial tears combined with MEGS. Immediately after each session, subjects were tested on: a subjective comfort questionnaire, tear breakup time (TBUT), fluorescein staining, lissamine green staining, and conjunctival injection. In this study, a significant correlation was found between cumulative lifetime computer use and ocular surface disorder, as measured by the standardized OSDI index. The experimental and control subjects were significantly different (P < 0.05). Isolation of the ocular surface alone produced significant improvements in comfort scores and TBUT and a consistent trend of improvement in fluorescein staining and lissamine green staining. Isolation plus tears produced a significant improvement in lissamine green staining. The subjective comfort inventory and the TBUT

  16. MLSOIL and DFSOIL - computer codes to estimate effective ground surface concentrations for dose computations

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Kocher, D.C.; Killough, G.G.; Miller, C.W.

    1984-11-01

    This report is a user's manual for MLSOIL (Multiple Layer SOIL model) and DFSOIL (Dose Factors for MLSOIL) and documentation of the computational methods used in those two computer codes. MLSOIL calculates an effective ground surface concentration to be used in computations of external doses. This effective ground surface concentration is equal to (the computed dose in air from the concentration in the soil layers)/(the dose factor for computing dose in air from a plane). MLSOIL implements a five-compartment linear-transfer model to calculate the concentrations of radionuclides in the soil following deposition on the ground surface from the atmosphere. The model considers leaching through the soil as well as radioactive decay and buildup. The element-specific transfer coefficients used in this model are a function of the k_d and environmental parameters. DFSOIL calculates the dose in air per unit concentration at 1 m above the ground from each of the five soil layers used in MLSOIL and the dose per unit concentration from an infinite plane source. MLSOIL and DFSOIL have been written to be part of the Computerized Radiological Risk Investigation System (CRRIS), which is designed for assessments of the health effects of airborne releases of radionuclides. 31 references, 3 figures, 4 tables
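
    A five-compartment linear-transfer model of this kind amounts to a linear ODE system dC/dt = A C + s(t) with leaching and decay terms. The Python sketch below shows such a soil-layer chain with made-up rate constants; MLSOIL's element-specific transfer coefficients and layer structure are not reproduced here.

      import numpy as np
      from scipy.integrate import solve_ivp

      n_layers = 5
      leach = 0.05          # per day, transfer to the layer below (hypothetical)
      decay = 0.002         # per day, radioactive decay (hypothetical)
      deposition = 1.0      # activity deposited per day onto the top layer

      # Build the transfer matrix: loss from each layer, gain from the layer above.
      A = np.zeros((n_layers, n_layers))
      for i in range(n_layers):
          A[i, i] = -(leach + decay)
          if i > 0:
              A[i, i - 1] = leach

      def rhs(t, C):
          source = np.zeros(n_layers)
          source[0] = deposition            # atmospheric deposition enters layer 1
          return A @ C + source

      sol = solve_ivp(rhs, (0.0, 365.0), np.zeros(n_layers), t_eval=[365.0])
      print(sol.y[:, -1])                   # layer concentrations after one year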

  17. Validation of phalanx bone three-dimensional surface segmentation from computed tomography images using laser scanning

    International Nuclear Information System (INIS)

    DeVries, Nicole A.; Gassman, Esther E.; Kallemeyn, Nicole A.; Shivanna, Kiran H.; Magnotta, Vincent A.; Grosland, Nicole M.

    2008-01-01

    To examine the validity of manually defined bony regions of interest from computed tomography (CT) scans. Segmentation measurements were performed on the coronal reformatted CT images of the three phalanx bones of the index finger from five cadaveric specimens. Two smoothing algorithms (image-based and Laplacian surface-based) were evaluated to determine their ability to represent accurately the anatomic surface. The resulting surfaces were compared with laser surface scans of the corresponding cadaveric specimen. The average relative overlap between two tracers was 0.91 for all bones. The overall mean difference between the manual unsmoothed surface and the laser surface scan was 0.20 mm. Both image-based and Laplacian surface-based smoothing were compared; the overall mean difference for image-based smoothing was 0.21 mm and 0.20 mm for Laplacian smoothing. This study showed that manual segmentation of high-contrast, coronal, reformatted, CT datasets can accurately represent the true surface geometry of bones. Additionally, smoothing techniques did not significantly alter the surface representations. This validation technique should be extended to other bones, image segmentation and spatial filtering techniques. (orig.)

  18. Validation of phalanx bone three-dimensional surface segmentation from computed tomography images using laser scanning

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, Nicole A.; Gassman, Esther E.; Kallemeyn, Nicole A. [The University of Iowa, Department of Biomedical Engineering, Center for Computer Aided Design, Iowa City, IA (United States); Shivanna, Kiran H. [The University of Iowa, Center for Computer Aided Design, Iowa City, IA (United States); Magnotta, Vincent A. [The University of Iowa, Department of Biomedical Engineering, Department of Radiology, Center for Computer Aided Design, Iowa City, IA (United States); Grosland, Nicole M. [The University of Iowa, Department of Biomedical Engineering, Department of Orthopaedics and Rehabilitation, Center for Computer Aided Design, Iowa City, IA (United States)

    2008-01-15

    To examine the validity of manually defined bony regions of interest from computed tomography (CT) scans. Segmentation measurements were performed on the coronal reformatted CT images of the three phalanx bones of the index finger from five cadaveric specimens. Two smoothing algorithms (image-based and Laplacian surface-based) were evaluated to determine their ability to represent accurately the anatomic surface. The resulting surfaces were compared with laser surface scans of the corresponding cadaveric specimen. The average relative overlap between two tracers was 0.91 for all bones. The overall mean difference between the manual unsmoothed surface and the laser surface scan was 0.20 mm. Both image-based and Laplacian surface-based smoothing were compared; the overall mean difference for image-based smoothing was 0.21 mm and 0.20 mm for Laplacian smoothing. This study showed that manual segmentation of high-contrast, coronal, reformatted, CT datasets can accurately represent the true surface geometry of bones. Additionally, smoothing techniques did not significantly alter the surface representations. This validation technique should be extended to other bones, image segmentation and spatial filtering techniques. (orig.)

  19. Computer aided surface representation. Progress report, June 1, 1989--May 31, 1990

    Energy Technology Data Exchange (ETDEWEB)

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  20. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling based reliability sensitivity analysis method is employed to reduce the computational effort further when design variables are distributional parameters of input random variables. The proposed methodology is applied on two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
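
    The supporting points referred to here come from Gauss-Hermite quadrature; for a normally distributed variable the nodes and weights can be generated as in the short Python sketch below. The node count and the example distribution are illustrative, not the paper's settings.

      import numpy as np
      from numpy.polynomial.hermite import hermgauss

      def gauss_hermite_points(mean, std, n_nodes=5):
          """Quadrature nodes/weights for a normal variable N(mean, std^2)."""
          x, w = hermgauss(n_nodes)              # physicists' Hermite nodes/weights
          nodes = mean + np.sqrt(2.0) * std * x  # change of variables to N(mean, std^2)
          weights = w / np.sqrt(np.pi)           # weights now sum to 1
          return nodes, weights

      nodes, weights = gauss_hermite_points(mean=10.0, std=2.0)
      # Sanity check: the rule reproduces the mean and variance of the variable.
      print(np.sum(weights * nodes), np.sum(weights * (nodes - 10.0) ** 2))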

  1. Multiresolution Computation of Conformal Structures of Surfaces

    Directory of Open Access Journals (Sweden)

    Xianfeng Gu

    2003-10-01

    An efficient multiresolution method to compute global conformal structures of nonzero-genus triangle meshes is introduced. The homology and cohomology groups of the meshes are computed explicitly, and then a basis of harmonic one-forms and a basis of holomorphic one-forms are constructed. A progressive mesh is generated to represent the original surface at different resolutions. The conformal structure is computed for the coarse level first and then used as an estimate for that of the finer level, where it is refined to the conformal structure of the finer level using the conjugate gradient method.

  2. A surface capturing method for the efficient computation of steady water waves

    NARCIS (Netherlands)

    Wackers, J.; Koren, B.

    2008-01-01

    A surface capturing method is developed for the computation of steady water–air flow with gravity. Fluxes are based on artificial compressibility and the method is solved with a multigrid technique and line Gauss–Seidel smoother. A test on a channel flow with a bottom bump shows the accuracy of the

  3. Conformal-Based Surface Morphing and Multi-Scale Representation

    Directory of Open Access Journals (Sweden)

    Ka Chun Lam

    2014-05-01

    This paper presents two algorithms, based on conformal geometry, for the multi-scale representation of geometric shapes and for surface morphing. A multi-scale surface representation aims to describe a 3D shape at different levels of geometric detail, which allows analyzing or editing surfaces at the global or local scales effectively. Surface morphing refers to the process of interpolating between two geometric shapes, which has been widely applied to estimate or analyze deformations in computer graphics, computer vision and medical imaging. In this work, we propose two geometric models for surface morphing and multi-scale representation of 3D surfaces. The basic idea is to represent a 3D surface by its mean curvature function, H, and conformal factor function, λ, which uniquely determine the geometry of the surface according to Riemann surface theory. Once we have the (λ, H) parameterization of the surface, post-processing of the surface can be done directly on the conformal parameter domain. In particular, the problem of multi-scale representation of shapes can be reduced to signal filtering on the λ and H parameters. On the other hand, the surface morphing problem can be transformed into an interpolation process between two sets of (λ, H) parameters. We test the proposed algorithms on 3D human face data and MRI-derived brain surfaces. Experimental results show that our proposed methods can effectively obtain multi-scale surface representations and give natural surface morphing results.

  4. A surface code quantum computer in silicon

    Science.gov (United States)

    Hill, Charles D.; Peretz, Eldad; Hile, Samuel J.; House, Matthew G.; Fuechsle, Martin; Rogge, Sven; Simmons, Michelle Y.; Hollenberg, Lloyd C. L.

    2015-01-01

    The exceptionally long quantum coherence times of phosphorus donor nuclear spin qubits in silicon, coupled with the proven scalability of silicon-based nano-electronics, make them attractive candidates for large-scale quantum computing. However, the high threshold of topological quantum error correction can only be captured in a two-dimensional array of qubits operating synchronously and in parallel—posing formidable fabrication and control challenges. We present an architecture that addresses these problems through a novel shared-control paradigm that is particularly suited to the natural uniformity of the phosphorus donor nuclear spin qubit states and electronic confinement. The architecture comprises a two-dimensional lattice of donor qubits sandwiched between two vertically separated control layers forming a mutually perpendicular crisscross gate array. Shared-control lines facilitate loading/unloading of single electrons to specific donors, thereby activating multiple qubits in parallel across the array on which the required operations for surface code quantum error correction are carried out by global spin control. The complexities of independent qubit control, wave function engineering, and ad hoc quantum interconnects are explicitly avoided. With many of the basic elements of fabrication and control based on demonstrated techniques and with simulated quantum operation below the surface code error threshold, the architecture represents a new pathway for large-scale quantum information processing in silicon and potentially in other qubit systems where uniformity can be exploited. PMID:26601310

  5. A surface code quantum computer in silicon.

    Science.gov (United States)

    Hill, Charles D; Peretz, Eldad; Hile, Samuel J; House, Matthew G; Fuechsle, Martin; Rogge, Sven; Simmons, Michelle Y; Hollenberg, Lloyd C L

    2015-10-01

    The exceptionally long quantum coherence times of phosphorus donor nuclear spin qubits in silicon, coupled with the proven scalability of silicon-based nano-electronics, make them attractive candidates for large-scale quantum computing. However, the high threshold of topological quantum error correction can only be captured in a two-dimensional array of qubits operating synchronously and in parallel-posing formidable fabrication and control challenges. We present an architecture that addresses these problems through a novel shared-control paradigm that is particularly suited to the natural uniformity of the phosphorus donor nuclear spin qubit states and electronic confinement. The architecture comprises a two-dimensional lattice of donor qubits sandwiched between two vertically separated control layers forming a mutually perpendicular crisscross gate array. Shared-control lines facilitate loading/unloading of single electrons to specific donors, thereby activating multiple qubits in parallel across the array on which the required operations for surface code quantum error correction are carried out by global spin control. The complexities of independent qubit control, wave function engineering, and ad hoc quantum interconnects are explicitly avoided. With many of the basic elements of fabrication and control based on demonstrated techniques and with simulated quantum operation below the surface code error threshold, the architecture represents a new pathway for large-scale quantum information processing in silicon and potentially in other qubit systems where uniformity can be exploited.

  6. General surface reconstruction for cone-beam multislice spiral computed tomography

    International Nuclear Information System (INIS)

    Chen Laigao; Liang Yun; Heuscher, Dominic J.

    2003-01-01

    A new family of cone-beam reconstruction algorithms, the General Surface Reconstruction (GSR), is proposed and formulated in this paper for multislice spiral computed tomography (CT) reconstruction. It provides a general framework that allows the reconstruction of planar or nonplanar surfaces from a set of rebinned short-scan parallel-beam projection data. An iterative surface formation method is proposed as an example to show the possibility of forming nonplanar reconstruction surfaces that minimize the mismatch between the collected cone-beam projection data and the reconstruction surfaces. The improvement in accuracy of nonplanar surfaces over planar surfaces in two-dimensional approximate cone-beam reconstruction is mathematically proved and demonstrated using numerical simulations. The proposed GSR algorithm is evaluated by computer simulation of a cone-beam spiral scanning geometry and various mathematical phantoms. The results demonstrate that the GSR algorithm generates much better image quality than conventional multislice reconstruction algorithms. For a table speed of up to 100 mm per rotation, GSR demonstrates good image quality for both the low-contrast ball phantom and the thorax phantom. All other performance parameters are comparable to the single-slice 180 deg. LI (linear interpolation) algorithm, which is considered the 'gold standard'. GSR also achieves high computing efficiency and good temporal resolution, making it an attractive alternative for the reconstruction of next-generation multislice spiral CT data.

  7. A radiosity-based model to compute the radiation transfer of soil surface

    Science.gov (United States)

    Zhao, Feng; Li, Yuguang

    2011-11-01

    A good understanding of the interactions of electromagnetic radiation with the soil surface is important for further improvement of remote sensing methods. In this paper, a radiosity-based analytical model for soil Directional Reflectance Factor (DRF) distributions was developed and evaluated. The model is specifically dedicated to the study of radiation transfer for soil surfaces under tillage practices. The soil was abstracted as two-dimensional U-shaped or V-shaped geometric structures with periodic macroscopic variations. The roughness of the simulated surfaces was expressed as the ratio of the height to the width of the U- and V-shaped structures. The assumption was made that the shadowing of the soil surface, simulated by U- or V-shaped grooves, has a greater influence on the soil reflectance distribution than the scattering properties of basic soil particles of silt and clay. Another assumption was that the soil is a perfectly diffuse reflector at a microscopic level, which is a prerequisite for the application of the radiosity method. This radiosity-based analytical model was evaluated against a forward Monte Carlo ray-tracing model under the same structural scenes and identical spectral parameters. The statistics of the two models' BRF fitting results for several soil structures under the same conditions showed good agreement. Using the model, the physical mechanism of the soil bidirectional reflectance pattern was revealed.
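
    The core of such a radiosity model is a linear system B = E + diag(rho) F B for the radiosities of the groove facets. The following is a minimal sketch of that solve, with the view-factor matrix F, facet reflectances rho, and direct irradiances E filled with placeholder values rather than values derived from an actual U- or V-groove geometry.

        import numpy as np

        # Radiosity balance for n diffuse facets: B = E + diag(rho) @ F @ B
        # F[i, j] = view factor from facet i to facet j (placeholder values; a real
        # model would compute them from the U/V-groove geometry).
        F = np.array([[0.0, 0.3, 0.2],
                      [0.3, 0.0, 0.3],
                      [0.2, 0.3, 0.0]])
        rho = np.array([0.25, 0.25, 0.25])   # assumed diffuse facet reflectances
        E = np.array([120.0, 80.0, 40.0])    # assumed direct irradiance on each facet (W m-2)

        A = np.eye(len(E)) - np.diag(rho) @ F    # (I - diag(rho) F) B = E
        B = np.linalg.solve(A, E)                # total radiosity leaving each facet
        print("facet radiosities:", B)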

  8. Near-Surface Seismic Velocity Data: A Computer Program For ...

    African Journals Online (AJOL)

    A computer program (NESURVELANA) has been developed in Visual Basic Computer programming language to carry out a near surface velocity analysis. The method of analysis used includes: Algorithms design and Visual Basic codes generation for plotting arrival time (ms) against geophone depth (m) employing the ...

  9. Use of computational modeling in preparation and evaluation of surface imprinted xerogels for binding tetracycline

    International Nuclear Information System (INIS)

    Pace, Samantha J.; Nguyen, Eric; Baria, Maximillian P.; Mojica, Elmer-Rico E.

    2015-01-01

    Linear alkyl alkoxysilanes (methoxy and ethoxy-based) of varying length were used in preparing tetracycline surface imprinted silica xerogels by the sol–gel process. The resulting xerogels were characterized in terms of binding tetracycline (TC) by using tritium-labeled TC. Results showed preferential binding in the ethoxysilane based xerogels in comparison to methoxysilane based xerogels. A computational approach using the interaction energy (IE) between TC and each alkyl alkoxysilane was deduced as a rational way of predicting the formulation that would provide the best analytical performance for a given molecularly imprinted xerogel (MIX). Hartree-Fock calculations revealed an increase in IE as the length of the carbon chain increases until an optimum value at C6 in both alkoxysilanes. This is consistent with the experimental results wherein the C6 xerogel formulation has the highest imprinting factor. These results show the potential of using computational modeling as a rational way of preparing surface imprinted materials. (author)

  10. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  11. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base-isolated NPPs is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo Simulation are adopted. • A refined approach for modeling the non-linear behavior of the isolators is proposed. • Beyond-design conditions are addressed. • The procedure is applied to the preliminary design of the isolated IRIS. -- Abstract: The research work here described is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made, in the present paper, to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the Response Surface, the Monte Carlo method is used to compute the failure probability. The procedure is here applied to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non-linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models.
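
    The response-surface-plus-Monte-Carlo step described above can be illustrated with a small sketch: a quadratic response surface is fitted to a handful of hypothetical dynamic-analysis results in two normalized random variables, and the failure probability is then estimated by sampling the fitted surface against an assumed isolator displacement capacity. All numbers (design points, demands, the 0.40 m capacity) are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical dynamic-analysis results: isolator displacement demand (m) at a few
        # design points of two normalized random variables (e.g. ground-motion intensity
        # and isolator stiffness deviation). Values are invented.
        X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0],
                      [1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
        y = np.array([0.18, 0.22, 0.35, 0.41, 0.27, 0.37, 0.20, 0.31, 0.25])

        # Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
        def design(X):
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

        beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

        # Monte Carlo on the fitted surface: random variables taken as standard normal,
        # failure when the displacement demand exceeds an assumed 0.40 m capacity.
        samples = rng.standard_normal((200_000, 2))
        demand = design(samples) @ beta
        print("estimated failure probability:", np.mean(demand > 0.40))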

  12. Descriptive and Computer Aided Drawing Perspective on an Unfolded Polyhedral Projection Surface

    Science.gov (United States)

    Dzwierzynska, Jolanta

    2017-10-01

    The aim of this study is to develop a method for the direct and practical mapping of perspective onto an unfolded prismatic polyhedral projection surface. The considered perspective representation is a rectilinear central projection onto a surface composed of several flat elements. In the paper, two descriptive methods of drawing perspective are presented: direct and indirect. The graphical mapping of the effects of the representation is realized directly on the unfolded flat projection surface, thanks to the projective and graphical connection between points displayed on the polyhedral background and their counterparts received on the unfolded flat surface. To significantly improve line construction, analytical algorithms are formulated that draw the perspective image of a line segment passing through two points given by their coordinates in a spatial x, y, z coordinate system. Compared to other perspective construction methods used in computer vision and computer-aided design, which rely on information about points, our algorithms utilize data about lines, which occur very often in architectural forms. The ability to draw lines in the considered perspective makes it possible to draw an edge perspective image of an architectural object. Adjustable base elements of the perspective, such as the horizon height and the station point location, enable perspective images to be drawn from different viewing positions. The analytical algorithms for drawing perspective images are formulated in Mathcad software; however, they can be implemented in the majority of computer graphics packages, which can make drawing perspective more efficient and easier. The representation presented in the paper, and the way it is mapped directly onto the flat unfolded projection surface, can find application in the presentation of architectural space in advertising and art.
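
    The basic operation behind such algorithms is the central projection of a segment's two endpoints from the station point onto a flat picture element. The sketch below shows only that step, with an illustrative station point and picture plane; the unfolding onto the polyhedral surface described in the paper is not reproduced.

        import numpy as np

        def central_projection(p, eye, plane_point, plane_normal):
            """Project 3D point p from the station point `eye` onto a flat picture plane."""
            d = p - eye                                    # viewing ray direction
            n = plane_normal / np.linalg.norm(plane_normal)
            t = np.dot(plane_point - eye, n) / np.dot(d, n)
            return eye + t * d                             # ray/plane intersection

        # Perspective image of a segment: project its two endpoints and join them.
        eye          = np.array([0.0, -10.0, 1.6])         # assumed station point
        plane_point  = np.array([0.0, 0.0, 0.0])           # a point on the picture plane
        plane_normal = np.array([0.0, 1.0, 0.0])           # picture plane facing the viewer

        A = np.array([2.0, 5.0, 0.0])                      # segment endpoints (x, y, z)
        B = np.array([2.0, 5.0, 3.0])
        print(central_projection(A, eye, plane_point, plane_normal))
        print(central_projection(B, eye, plane_point, plane_normal))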

  13. Multivariate Tensor-based Brain Anatomical Surface Morphometry via Holomorphic One-Forms

    OpenAIRE

    Wang, Yalin; Chan, Tony F.; Toga, Arthur W.; Thompson, Paul M.

    2009-01-01

    Here we introduce multivariate tensor-based surface morphometry using holomorphic one-forms to study brain anatomy. We computed new statistics from the Riemannian metric tensors that retain the full information in the deformation tensor fields. We introduce two different holomorphic one-forms that induce different surface conformal parameterizations. We applied this framework to 3D MRI data to analyze hippocampal surface morphometry in Alzheimer’s Disease (AD; 26 subjects), lateral ventricula...

  14. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  15. Experimental and computational studies of positron-stimulated ion desorption from TiO2(1 1 0) surface

    Science.gov (United States)

    Yamashita, T.; Hagiwara, S.; Tachibana, T.; Watanabe, K.; Nagashima, Y.

    2017-11-01

    Experimental and computational studies of the positron-stimulated O+ ion desorption process from a TiO2(1 1 0) surface are reported. The measured data indicate that the O+ ion yields depend on the positron incident energy in the energy range between 0.5 keV and 15 keV. This dependence is closely related to the fraction of positrons which diffuse back to the surface after thermalization in the bulk. Based on the experimental and computational results, we conclude that the ion desorption via positron-stimulation occurs dominantly by the annihilation of surface-trapped positrons with core electrons of the topmost surface atoms.

  16. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  17. Navier-Stokes Computations With One-Equation Turbulence Model for Flows Along Concave Wall Surfaces

    Science.gov (United States)

    Wang, Chi R.

    2005-01-01

    This report presents the use of a time-marching three-dimensional compressible Navier-Stokes equation numerical solver with a one-equation turbulence model to simulate the flow fields developed along concave wall surfaces without and with a downstream extension flat wall surface. The 3-D Navier-Stokes numerical solver came from the NASA Glenn-HT code. The one-equation turbulence model was derived from the Spalart and Allmaras model. The computational approach was first calibrated with the computations of the velocity and Reynolds shear stress profiles of a steady flat plate boundary layer flow. The computational approach was then used to simulate developing boundary layer flows along concave wall surfaces without and with a downstream extension wall. The author investigated the computational results of surface friction factors, near surface velocity components, near wall temperatures, and a turbulent shear stress component in terms of turbulence modeling, computational mesh configurations, inlet turbulence level, and time iteration step. The computational results were compared with existing measurements of skin friction factors, velocity components, and shear stresses of the developing boundary layer flows. With a fine computational mesh and a one-equation model, the computational approach could predict accurately the skin friction factors, near surface velocity and temperature, and shear stress within the flows. The computed velocity components and shear stresses also showed the vortices effect on the velocity variations over a concave wall. The computed eddy viscosities at the near wall locations were also compared with the results from a two equation turbulence modeling technique. The inlet turbulence length scale was found to have little effect on the eddy viscosities at locations near the concave wall surface. The eddy viscosities, from the one-equation and two-equation modeling, were comparable at most stream-wise stations. The present one

  18. simEye: computer-based simulation of visual perception under various eye defects using Zernike polynomials

    OpenAIRE

    Fink, Wolfgang; Micol, Daniel

    2006-01-01

    We describe a computer eye model that allows for aspheric surfaces and a three-dimensional computer-based ray-tracing technique to simulate optical properties of the human eye and visual perception under various eye defects. Eye surfaces, such as the cornea, eye lens, and retina, are modeled or approximated by a set of Zernike polynomials that are fitted to input data for the respective surfaces. A ray-tracing procedure propagates light rays using Snell’s law of refraction from an input objec...
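
    The ray-propagation step mentioned above rests on the vector form of Snell's law. A short generic sketch of refraction of a unit direction at a surface with a given unit normal is shown below; it is not the simEye implementation, and the indices 1.000 and 1.376 in the usage line are a typical air/cornea pair used only for illustration.

        import numpy as np

        def refract(d, n, n1, n2):
            """Refract direction d at a surface with unit normal n (pointing toward the
            incident medium), for refractive indices n1 -> n2. Returns None on total
            internal reflection."""
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            eta = n1 / n2
            cos_i = -np.dot(n, d)
            sin2_t = eta**2 * (1.0 - cos_i**2)
            if sin2_t > 1.0:
                return None                      # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return eta * d + (eta * cos_i - cos_t) * n

        # Air-to-cornea refraction of a ray hitting a locally flat patch of the surface.
        print(refract(np.array([0.0, -0.2, -1.0]), np.array([0.0, 0.0, 1.0]), 1.000, 1.376))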

  19. Microprocessor-based simulator of surface ECG signals

    International Nuclear Information System (INIS)

    Martínez, A E; Rossi, E; Siri, L Nicola

    2007-01-01

    In this work, a simulator of recorded surface electrocardiogram (ECG) signals is presented. The device, based on a microcontroller and commanded by a personal computer, produces an analog signal resembling actual ECGs, not only in time course and voltage levels, but also in source impedance. The simulator is a useful tool for electrocardiograph calibration and monitoring, and can also be incorporated in educational tasks and in clinical environments for early detection of faulty behaviour

  20. Computed Potential Energy Surfaces and Minimum Energy Pathways for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. For some dynamics methods, global potential energy surfaces are required. In this case, it is necessary to obtain the energy at a complete sampling of all the possible arrangements of the nuclei, which are energetically accessible, and then a fitting function must be obtained to interpolate between the computed points. In other cases, characterization of the stationary points and the reaction pathway connecting them is sufficient. These properties may be readily obtained using analytical derivative methods. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications including global potential energy surfaces, H + O2, H + N2, O(3P) + H2, and reaction pathways for complex reactions, including reactions leading to NO and soot formation in hydrocarbon combustion.

  1. Nanoscale phosphorus atom arrays created using STM for the fabrication of a silicon based quantum computer.

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, J. L. (Jeremy L.); Schofield, S. R. (Steven R.); Simmons, M. Y. (Michelle Y.); Clark, R. G. (Robert G.); Dzurak, A. S. (Andrew S.); Curson, N. J. (Neil J.); Kane, B. E. (Bruce E.); McAlpine, N. S. (Neal S.); Hawley, M. E. (Marilyn E.); Brown, G. W. (Geoffrey W.)

    2001-01-01

    Quantum computers offer the promise of formidable computational power for certain tasks. Of the various possible physical implementations of such a device, silicon based architectures are attractive for their scalability and ease of integration with existing silicon technology. These designs use either the electron or nuclear spin state of single donor atoms to store quantum information. Here we describe a strategy to fabricate an array of single phosphorus atoms in silicon for the construction of such a silicon based quantum computer. We demonstrate the controlled placement of single phosphorus bearing molecules on a silicon surface. This has been achieved by patterning a hydrogen mono-layer 'resist' with a scanning tunneling microscope (STM) tip and exposing the patterned surface to phosphine (PH3) molecules. We also describe preliminary studies into a process to incorporate these surface phosphorus atoms into the silicon crystal at the array sites. Keywords: Quantum computing, nanotechnology, scanning tunneling microscopy, hydrogen lithography

  2. Evaluation of interatomic potentials for noble gas atoms from rainbow scattering under axial channeling at Ag(1 1 1) surface by computer simulations based on binary collision approximation

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Wataru, E-mail: take@sp.ous.ac.jp

    2016-01-01

    The rainbow angles, corresponding to pronounced small-angle peaks in the angular distributions of scattered projectiles attributed to rainbow scattering (RS) under axial surface channeling conditions, are strongly dependent on the interatomic potentials between projectiles and target atoms. The dependence of the rainbow angles on the normal component of the projectile energy with respect to the target surface, obtained experimentally by Schüller and Winter (SW) (2007) for RS of He, Ne and Ar atoms from a Ag(1 1 1) surface at projectile energies of 3–60 keV, was evaluated by three-dimensional computer simulations using the ACOCT code, which is based on the binary collision approximation with interatomic pair potentials. The ACOCT results employing the Moliere pair potential, with a screening length correction close to the adjustable one of the O’Connor and Biersack (OB) formula, are nearly in agreement with the experimental results and are consistent with SW’s analysis by classical trajectory simulations treating RS from corrugated equipotential planes based on continuum potentials, including the Moliere pair potential with the screening length correction of the OB formula.

  3. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework for deploying volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present on the BOINC server from a remote visualizer. (Author)

  4. Adsorption of molecular additive onto lead halide perovskite surfaces: A computational study on Lewis base thiophene additive passivation

    Science.gov (United States)

    Zhang, Lei; Yu, Fengxi; Chen, Lihong; Li, Jingfa

    2018-06-01

    Organic additives, such as the Lewis base thiophene, have been successfully applied to passivate halide perovskite surfaces, improving the stability and properties of perovskite devices based on CH3NH3PbI3. Yet, the detailed nanostructure of the perovskite surface passivated by additives and the mechanisms of such passivation are not well understood. This study presents a nanoscopic view of the interfacial structure of an additive/perovskite interface, consisting of a Lewis base thiophene molecular additive and a lead halide perovskite surface substrate, providing insights into the mechanisms by which molecular additives passivate halide perovskite surfaces and enhance perovskite-based device performance. A molecular dynamics study of the interactions between water molecules and the perovskite surfaces passivated by the investigated additive reveals the effectiveness of employing molecular additives to improve the stability of halide perovskite materials. The additive/perovskite surface system is further probed by molecular engineering of the perovskite surfaces. This study reveals the nanoscopic structure-property relationships of halide perovskite surfaces passivated by molecular additives, which aids the fundamental understanding of surface/interface engineering strategies for the development of halide-perovskite-based devices.

  5. A 3D edge detection technique for surface extraction in computed tomography for dimensional metrology applications

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Ontiveros, S.; Jiménez, R.

    2013-01-01

    Many factors influence the measurement uncertainty when using computed tomography for dimensional metrology applications. One of the most critical steps is the surface extraction phase. An incorrect determination of the surface may significantly increase the measurement uncertainty. This paper...... presents an edge detection method for the surface extraction based on a 3D Canny algorithm with sub-voxel resolution. The advantages of this method are shown in comparison with the most commonly used technique nowadays, i.e. the local threshold definition. Both methods are applied to reference standards...
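
    The sub-voxel idea can be illustrated in one dimension: the edge is placed at the peak of the gradient magnitude along a gray-value profile, and a parabola fitted to that peak and its two neighbours refines the position below one voxel. This is a simplified stand-in for the 3D Canny procedure of the paper, run on an invented CT profile.

        import numpy as np

        def subvoxel_edge(profile):
            """Locate an edge along a 1D gray-value profile with sub-voxel resolution.

            The edge is taken at the peak of the gradient magnitude; a parabola fitted
            to the peak sample and its two neighbours refines the position."""
            g = np.abs(np.gradient(profile.astype(float)))
            k = int(np.argmax(g[1:-1])) + 1               # discrete gradient-magnitude peak
            y0, y1, y2 = g[k - 1], g[k], g[k + 1]
            denom = y0 - 2.0 * y1 + y2
            offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
            return k + offset                             # edge position in voxel units

        # Synthetic CT profile crossing from background to material around voxel 4.
        profile = np.array([10, 11, 10, 12, 60, 118, 121, 120, 119, 120])
        print(f"edge at voxel {subvoxel_edge(profile):.2f}")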

  6. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  7. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.)

  8. Computation of stress on the surface of a soft homogeneous arbitrarily shaped particle.

    Science.gov (United States)

    Yang, Minglin; Ren, Kuan Fang; Wu, Yueqian; Sheng, Xinqing

    2014-04-01

    Prediction of the stress on the surface of an arbitrarily shaped particle of soft material is essential in the study of the elastic properties of particles with optical force. It is also necessary in the manipulation and sorting of small particles with optical tweezers, since a regular-shaped particle, such as a sphere, may be deformed under the nonuniform optical stress on its surface. The stress profile on a spherical or small spheroidal soft particle trapped by shaped beams has been studied; however, little work on computing the surface stress of an irregular-shaped particle has been reported. In this paper we apply the surface integral equation with a multilevel fast multipole algorithm to compute the surface stress on soft homogeneous arbitrarily shaped particles. The comparison of the computed stress profile with that predicted by the generalized Lorenz-Mie theory for a water droplet of diameter equal to 51 wavelengths in a focused Gaussian beam shows that the precision of our method is very good. Stress profiles on spheroids with different aspect ratios are then computed, with the particles illuminated by Gaussian beams of different waist radii at different incidences. Physical analysis of the mechanism of optical stress is given with the help of our recently developed vectorial complex ray model. It is found that the maximum of the stress profile on the surface of prolate spheroids is determined not only by the reflected and refracted rays (orders p=0,1) but also by the rays undergoing one or two internal reflections where they focus. A computational study of the stress on the surface of a biconcave cell-like particle, which is a typical application in life science, is also undertaken.

  9. Multivariate tensor-based brain anatomical surface morphometry via holomorphic one-forms.

    Science.gov (United States)

    Wang, Yalin; Chan, Tony F; Toga, Arthur W; Thompson, Paul M

    2009-01-01

    Here we introduce multivariate tensor-based surface morphometry using holomorphic one-forms to study brain anatomy. We computed new statistics from the Riemannian metric tensors that retain the full information in the deformation tensor fields. We introduce two different holomorphic one-forms that induce different surface conformal parameterizations. We applied this framework to 3D MRI data to analyze hippocampal surface morphometry in Alzheimer's Disease (AD; 26 subjects), lateral ventricular surface morphometry in HIV/AIDS (19 subjects) and cortical surface morphometry in Williams Syndrome (WS; 80 subjects). Experimental results demonstrated that our method powerfully detected brain surface abnormalities. Multivariate statistics on the local tensors outperformed other TBM methods including analysis of the Jacobian determinant, the largest eigenvalue, or the pair of eigenvalues, of the surface Jacobian matrix.

  10. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    The present work focuses on optimization of biodiesel combustion phenomena through a parametric approach using response surface methodology. Physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of these methyl esters, the properties of pongamia biodiesel and its blends were estimated. CONVERGE computational fluid dynamics software was used to simulate the fuel spray, turbulence, and combustion phenomena. Simulation responses such as indicated specific fuel consumption, NOx, and soot were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be a compression ratio of 16.75, start of injection at 21.9° before top dead center, and exhaust gas recirculation of 10.94%. Results were compared with the baseline case.
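
    A minimal sketch of the response-surface step described above: a second-order polynomial in compression ratio, start of injection, and EGR is fitted to a design-of-experiments table and then minimized within bounds. The data below are synthetic placeholders (not the paper's measurements), chosen only so that the fitted surface has a well-defined interior optimum.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical design-of-experiments table: (compression ratio, start of injection
        # in deg bTDC, EGR in %) versus indicated specific fuel consumption (g/kWh).
        X = np.array([[15, 19,  5], [15, 19, 15], [15, 23,  5], [15, 23, 15],
                      [17, 19,  5], [17, 19, 15], [17, 23,  5], [17, 23, 15],
                      [14, 21, 10], [18, 21, 10], [16, 19, 10], [16, 25, 10],
                      [16, 21,  2], [16, 21, 20], [16, 21, 10]], dtype=float)
        y = np.array([200.7, 200.3, 198.5, 198.1, 198.3, 197.9, 196.1, 195.7,
                      201.3, 196.5, 198.0, 198.4, 197.3, 197.3, 195.7])

        def features(x):
            cr, soi, egr = np.atleast_2d(x).T
            return np.column_stack([np.ones_like(cr), cr, soi, egr, cr**2, soi**2, egr**2,
                                    cr * soi, cr * egr, soi * egr])

        beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)   # second-order response surface
        isfc = lambda x: (features(x) @ beta)[0]                 # predicted ISFC at a design point

        res = minimize(isfc, x0=[16.0, 21.0, 10.0], bounds=[(14, 18), (19, 25), (2, 20)])
        print("predicted optimum (CR, SOI, EGR):", np.round(res.x, 2), "ISFC:", round(res.fun, 1))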

  11. Geometrical error calibration in reflective surface testing based on reverse Hartmann test

    Science.gov (United States)

    Gong, Zhidong; Wang, Daodang; Xu, Ping; Wang, Chao; Liang, Rongguang; Kong, Ming; Zhao, Jun; Mo, Linhai; Mo, Shuhui

    2017-08-01

    In fringe-illumination deflectometry based on the reverse-Hartmann-test configuration, ray tracing of the modeled testing system is performed to reconstruct the test surface error. Careful calibration of the system geometry is required to achieve high testing accuracy. To realize high-precision surface testing with the reverse Hartmann test, a computer-aided geometrical error calibration method is proposed. The aberrations corresponding to various geometrical errors are studied. Using the aberration weights for the various geometrical errors, computer-aided optimization of the system geometry with iterative ray tracing is carried out to calibrate the geometrical errors, and accuracy at the subnanometer level is achieved.

  12. Computational Sensing of Staphylococcus aureus on Contact Lenses Using 3D Imaging of Curved Surfaces and Machine Learning.

    Science.gov (United States)

    Veli, Muhammed; Ozcan, Aydogan

    2018-03-27

    We present a cost-effective and portable platform based on contact lenses for noninvasively detecting Staphylococcus aureus, which is part of the human ocular microbiome and resides on the cornea and conjunctiva. Using S. aureus-specific antibodies and a surface chemistry protocol that is compatible with human tears, contact lenses are designed to specifically capture S. aureus. After the bacteria capture on the lens and right before its imaging, the captured bacteria are tagged with surface-functionalized polystyrene microparticles. These microbeads provide sufficient signal-to-noise ratio for the quantification of the captured bacteria on the contact lens, without any fluorescent labels, by 3D imaging of the curved surface of each lens using only one hologram taken with a lens-free on-chip microscope. After the 3D surface of the contact lens is computationally reconstructed using rotational field transformations and holographic digital focusing, a machine learning algorithm is employed to automatically count the number of beads on the lens surface, revealing the count of the captured bacteria. To demonstrate its proof-of-concept, we created a field-portable and cost-effective holographic microscope, which weighs 77 g, controlled by a laptop. Using daily contact lenses that are spiked with bacteria, we demonstrated that this computational sensing platform provides a detection limit of ∼16 bacteria/μL. This contact-lens-based wearable sensor can be broadly applicable to detect various bacteria, viruses, and analytes in tears using a cost-effective and portable computational imager that might be used even at home by consumers.

  13. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed

  14. The advantages of the surface Laplacian in brain-computer interface research.

    Science.gov (United States)

    McFarland, Dennis J

    2015-09-01

    Brain-computer interface (BCI) systems frequently use signal processing methods, such as spatial filtering, to enhance performance. The surface Laplacian can reduce spatial noise and aid in identification of sources. In BCI research, these two functions of the surface Laplacian correspond to prediction accuracy and signal orthogonality. In the present study, an off-line analysis of data from a sensorimotor rhythm-based BCI task dissociated these functions of the surface Laplacian by comparing nearest-neighbor and next-nearest neighbor Laplacian algorithms. The nearest-neighbor Laplacian produced signals that were more orthogonal while the next-nearest Laplacian produced signals that resulted in better accuracy. Both prediction and signal identification are important for BCI research. Better prediction of user's intent produces increased speed and accuracy of communication and control. Signal identification is important for ruling out the possibility of control by artifacts. Identifying the nature of the control signal is relevant both to understanding exactly what is being studied and in terms of usability for individuals with limited motor control. Copyright © 2014 Elsevier B.V. All rights reserved.
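
    For reference, a nearest-neighbour (Hjorth-style) surface Laplacian simply subtracts the mean of the surrounding electrodes from the centre channel. The sketch below shows that operation on toy numbers and an assumed channel layout; it is a generic illustration, not the specific spatial filter configuration evaluated in the study.

        import numpy as np

        def hjorth_laplacian(center, neighbors):
            """Nearest-neighbour (Hjorth-style) surface Laplacian for one EEG channel:
            the channel voltage minus the mean of its surrounding electrodes."""
            return center - np.mean(neighbors, axis=0)

        # Toy example: C3 referenced against four assumed nearest neighbours, for a short
        # segment of samples. Values are arbitrary microvolt readings.
        c3        = np.array([12.0, 10.5,  9.8, 11.2])
        neighbors = np.array([[ 8.0,  7.5,  7.0,  8.2],    # FC3
                              [ 9.0,  8.8,  8.1,  9.4],    # CP3
                              [10.0,  9.5,  9.0, 10.1],    # C1
                              [ 7.5,  7.2,  6.9,  7.8]])   # C5
        print(hjorth_laplacian(c3, neighbors))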

  15. Direct Monte Carlo dose calculation using polygon-surface computational human model

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Kim, Chan Hyeong; Yeom, Yeon Su; Cho, Sungkoo; Chung, Min Suk; Cho, Kun-Woo

    2011-01-01

    In the present study, a voxel-type computational human model was converted to a polygon-surface model, after which it was imported directly to the Geant4 code without using a voxelization process, that is, without converting back to a voxel model. The original voxel model was also imported to the Geant4 code, in order to compare the calculated dose values and the computational speed. The average polygon size of the polygon-surface model was ∼0.5 cm², whereas the voxel resolution of the voxel model was 1.981 × 1.981 × 2.0854 mm³. The results showed a good agreement between the calculated dose values of the two models. The polygon-surface model was, however, slower than the voxel model by a factor of 6–9 for the photon energies and irradiation geometries considered in the present study, which nonetheless is considered acceptable, considering that direct use of the polygon-surface model does not require a separate voxelization process. (author)

  16. Computation of Surface Laplacian for tri-polar ring electrodes on high-density realistic geometry head model.

    Science.gov (United States)

    Junwei Ma; Han Yuan; Sunderam, Sridhar; Besio, Walter; Lei Ding

    2017-07-01

    Neural activity inside the human brain generates electrical signals that can be detected on the scalp. Electroencephalography (EEG) is one of the most widely utilized techniques helping physicians and researchers to diagnose and understand various brain diseases. Due to its nature, EEG signals have very high temporal resolution but poor spatial resolution. To achieve higher spatial resolution, a novel tri-polar concentric ring electrode (TCRE) has been developed to directly measure the Surface Laplacian (SL). The objective of the present study is to accurately calculate the SL for TCRE based on a realistic geometry head model. A locally dense mesh was proposed to represent the head surface, where the locally dense parts match the small structural components of the TCRE. Other areas were left without a dense mesh to reduce the computational load. We conducted computer simulations to evaluate the performance of the proposed mesh and evaluated possible numerical errors as compared with a low-density model. Finally, with the achieved accuracy, we present the computed forward lead field of the SL for TCRE for the first time in a realistic geometry head model and demonstrate that it has better spatial resolution than the SL computed from classic EEG recordings.

  17. Automatic vertebral identification using surface-based registration

    Science.gov (United States)

    Herring, Jeannette L.; Dawant, Benoit M.

    2000-06-01

    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.
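
    The selection step described above can be sketched as follows: for each candidate vertebral surface, compute the point-to-surface residuals of the physical points and keep the candidate with the lowest residual standard deviation. In this simplified sketch the registration itself is assumed to have already been performed, the surfaces are represented by extracted point clouds, and the point-to-surface distance is approximated by the nearest surface point found with a KD-tree; all data are synthetic.

        import numpy as np
        from scipy.spatial import cKDTree

        def residual_std(physical_points, surface_points):
            """Standard deviation of point-to-surface residuals, with the surface distance
            approximated by the distance to the nearest extracted surface point."""
            dists, _ = cKDTree(surface_points).query(physical_points)
            return float(np.std(dists))

        # Synthetic stand-ins for three candidate vertebral surfaces (point clouds, mm) and
        # a set of physical points digitized on the posterior elements of the middle one.
        rng = np.random.default_rng(0)
        base = rng.normal(size=(500, 3)) * [30.0, 20.0, 10.0]
        candidates = {"L2": base + [0.0, 0.0, 35.0],
                      "L3": base,
                      "L4": base - [0.0, 0.0, 35.0]}
        physical = base[:30] + rng.normal(0.0, 0.4, size=(30, 3))

        scores = {label: residual_std(physical, pts) for label, pts in candidates.items()}
        print(scores)
        print("selected vertebra:", min(scores, key=scores.get))   # lowest residual spread wins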

  18. An assessment of the effectiveness of computer-based training for newly commissioned Surface Warfare Division officers.

    OpenAIRE

    Bowman, William R.; Crawford, Alice M.; Mehay, Stephen

    2009-01-01

    Approved for public release; distribution unlimited. The goal of this study was to analyze the effectiveness of the new SWOS-at-Sea training for newly commissioned surface warfare officers that was introduced in 2003. The new regime combined self-paced computer-based training (CBT) with on-the-job training (OJT) on-board an officer's ship. The study relied on a variety of analytical techniques, including a literature review of CBT and OJT training, interviews and focus groups with junior a...

  19. Uncertainty Estimate of Surface Irradiances Computed with MODIS-, CALIPSO-, and CloudSat-Derived Cloud and Aerosol Properties

    Science.gov (United States)

    Kato, Seiji; Loeb, Norman G.; Rutan, David A.; Rose, Fred G.; Sun-Mack, Sunny; Miller, Walter F.; Chen, Yan

    2012-07-01

    Differences of modeled surface upward and downward longwave and shortwave irradiances are calculated using modeled irradiance computed with active sensor-derived and passive sensor-derived cloud and aerosol properties. The irradiance differences are calculated for various temporal and spatial scales: monthly gridded, monthly zonal, monthly global, and annual global. Using the irradiance differences, the uncertainty of surface irradiances is estimated. The uncertainty (1σ) of the annual global surface downward longwave and shortwave is, respectively, 7 W m-2 (out of 345 W m-2) and 4 W m-2 (out of 192 W m-2), after known bias errors are removed. Similarly, the uncertainty of the annual global surface upward longwave and shortwave is, respectively, 3 W m-2 (out of 398 W m-2) and 3 W m-2 (out of 23 W m-2). The uncertainty is for modeled irradiances computed using cloud properties derived from imagers on a sun-synchronous orbit that covers the globe every day (e.g., the Moderate Resolution Imaging Spectroradiometer) or modeled irradiances computed for nadir-view-only active sensors on a sun-synchronous orbit such as Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation and CloudSat. If we assume that longwave and shortwave uncertainties are independent of each other, but up- and downward components are correlated with each other, the uncertainty in global annual mean net surface irradiance is 12 W m-2. One-sigma uncertainty bounds of the satellite-based net surface irradiance are 106 W m-2 and 130 W m-2.
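
    One way to reproduce the quoted 12 W m-2 from the component uncertainties is to add the correlated up- and downward terms linearly within each band and then combine the (independent) longwave and shortwave totals in quadrature; the short sketch below is an illustrative reading of that assumption, not the authors' exact computation.

        import math

        # 1-sigma component uncertainties quoted in the abstract (W m-2)
        lw_down, lw_up = 7.0, 3.0     # surface downward / upward longwave
        sw_down, sw_up = 4.0, 3.0     # surface downward / upward shortwave

        # Up- and downward components assumed fully correlated -> add linearly per band;
        # longwave and shortwave assumed independent -> combine in quadrature.
        lw_total = lw_down + lw_up            # 10 W m-2
        sw_total = sw_down + sw_up            #  7 W m-2
        net = math.hypot(lw_total, sw_total)  # sqrt(10^2 + 7^2) ~= 12 W m-2
        print(f"net surface irradiance uncertainty: {net:.1f} W m-2")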

  20. Computer-aided design of curved surfaces with automatic model generation

    Science.gov (United States)

    Staley, S. M.; Jerard, R. B.; White, P. R.

    1980-01-01

    The design and visualization of three-dimensional objects with curved surfaces have always been difficult. The paper given below describes a computer system which facilitates both the design and visualization of such surfaces. The system enhances the design of these surfaces by virtue of various interactive techniques coupled with the application of B-Spline theory. Visualization is facilitated by including a specially built model-making machine which produces three-dimensional foam models. Thus, the system permits the designer to produce an inexpensive model of the object which is suitable for evaluation and presentation.

  1. The Dimensions of the Orbital Cavity Based on High-Resolution Computed Tomography of Human Cadavers

    DEFF Research Database (Denmark)

    Felding, Ulrik Ascanius; Bloch, Sune Land; Buchwald, Christian von

    2016-01-01

    for surface area. To the authors' knowledge, this study is the first to have measured the entire surface area of the orbital cavity. The volume and surface area of the orbital cavity were estimated in computed tomography scans of 11 human cadavers using unbiased stereological sampling techniques. The mean (± SD......) total volume and total surface area of the orbital cavities were 24.27 ± 3.88 cm³ and 32.47 ± 2.96 cm², respectively. There was no significant difference in volume (P = 0.315) or surface area (P = 0.566) between the 2 orbital cavities. The stereological technique proved to be a robust and unbiased method...... that may be used as a gold standard for comparison with automated computer software. Future imaging studies in blow-out fracture patients may be based on individual and relative calculation involving both herniated volume and fractured surface area in relation to the total volume and surface area...
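
    The stereological (Cavalieri-type) volume estimate mentioned above amounts to: volume ≈ slice spacing × area per grid point × total number of grid points hitting the cavity across systematically sampled CT sections. The numbers below are invented, chosen only so the result lands near the reported mean of about 24 cm³.

        # Cavalieri-type stereological volume estimate from systematically sampled CT slices.
        slice_spacing_cm = 0.3          # assumed distance between sampled sections
        area_per_point_cm2 = 0.25       # assumed area represented by one grid point (0.5 cm grid)
        points_per_slice = [12, 25, 36, 44, 48, 47, 41, 35, 22, 10]   # invented point counts

        volume_cm3 = slice_spacing_cm * area_per_point_cm2 * sum(points_per_slice)
        print(f"estimated orbital volume: {volume_cm3:.1f} cm^3")     # 0.3 * 0.25 * 320 = 24.0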

  2. An adhesive contact mechanics formulation based on atomistically induced surface traction

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Houfu [Department of Civil and Environmental Engineering, University of California, Berkeley, CA 94720 (United States); Ren, Bo [Livermore Software Technology Corporation, 7374 Las Positas Road, Livermore, CA 94551 (United States); Li, Shaofan, E-mail: shaofan@berkeley.edu [Department of Civil and Environmental Engineering, University of California, Berkeley, CA 94720 (United States)

    2015-12-01

    In this work, we have developed a novel multiscale computational contact formulation based on the generalized Derjaguin approximation for continua that are characterized by atomistically enriched constitutive relations in order to study macroscopic interaction between arbitrarily shaped deformable continua. The proposed adhesive contact formulation makes use of the microscopic interaction forces between individual particles in the interacting bodies. In particular, the double-layer volume integral describing the contact interaction (energy, force vector, matrix) is converted into a double-layer surface integral through a mathematically consistent approach that employs the divergence theorem and a special partitioning technique. The proposed contact model is formulated in the nonlinear continuum mechanics framework and implemented using the standard finite element method. With no large penalty constant, the stiffness matrix of the system will in general be well-conditioned, which is of great significance for quasi-static analysis. Three numerical examples are presented to illustrate the capability of the proposed method. Results indicate that with the same mesh configuration, the finite element computation based on the surface integral approach is faster and more accurate than the volume integral based approach. In addition, the proposed approach is energy preserving even in a very long dynamic simulation.

  3. A Computational Study of Richtmyer-Meshkov Instability with Surface Tension

    Science.gov (United States)

    Francois, Marianne; Velechovsky, Jan; Jibben, Zach; Masser, Thomas; LANL Collaboration

    2017-11-01

    We have added the capability to model surface tension in our adaptive mesh refinement compressible flow solver, xRage. Our surface tension capability employs the continuum surface force to model surface tension and the height function method to compute curvatures. We have verified our model implementation for the static and oscillating droplets test cases and the linear regime of the Rayleigh-Taylor instability. With this newly added capability, we have performed a numerical study of the effects of surface tension on single-mode and multi-mode Richtmyer-Meshkov instability. This work was performed under the auspices of the National Nuclear Security Administration of the U.S. Department of Energy at Los Alamos National Laboratory under Contract No. DE-AC52 - 06NA25396.
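
    A minimal 1D sketch of the height-function curvature estimate mentioned above: given interface heights h(x) (in practice obtained from column sums of volume fractions), the curvature is kappa = h'' / (1 + h'^2)^(3/2), evaluated here by central differences on a toy circular interface. This is a generic illustration, not the xRage implementation.

        import numpy as np

        def height_function_curvature(h, dx):
            """Interface curvature from heights h(x): kappa = h'' / (1 + h'^2)^(3/2)."""
            hx  = (h[2:] - h[:-2]) / (2.0 * dx)                 # central first derivative
            hxx = (h[2:] - 2.0 * h[1:-1] + h[:-2]) / dx**2      # central second derivative
            return hxx / (1.0 + hx**2) ** 1.5

        # Toy interface: a circular cap of radius R, whose exact curvature magnitude is 1/R.
        R, dx = 2.0, 0.05
        x = np.arange(-0.5, 0.5 + dx, dx)
        h = np.sqrt(R**2 - x**2)
        kappa = height_function_curvature(h, dx)
        print("max |kappa| error vs 1/R:", np.max(np.abs(np.abs(kappa) - 1.0 / R)))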

  4. Guided access cavity preparation using cone-beam computed tomography and optical surface scans - an ex vivo study

    DEFF Research Database (Denmark)

    Buchgreitz, J; Buchgreitz, M; Mortensen, D

    2016-01-01

    AIM: To evaluate ex vivo the accuracy of a preparation procedure planned for teeth with pulp canal obliteration (PCO) using a guide rail concept based on a cone-beam computed tomography (CBCT) scan merged with an optical surface scan. METHODOLOGY: A total of 48 teeth were mounted in acrylic bloc...

  5. Adaptive local surface refinement based on LR NURBS and its application to contact

    Science.gov (United States)

    Zimmermann, Christopher; Sauer, Roger A.

    2017-12-01

    A novel adaptive local surface refinement technique based on Locally Refined Non-Uniform Rational B-Splines (LR NURBS) is presented. LR NURBS can model complex geometries exactly and are the rational extension of LR B-splines. The local representation of the parameter space overcomes the drawback of non-existent local refinement in standard NURBS-based isogeometric analysis. For a convenient embedding into general finite element codes, the Bézier extraction operator for LR NURBS is formulated. An automatic remeshing technique is presented that allows adaptive local refinement and coarsening of LR NURBS. In this work, LR NURBS are applied to contact computations of 3D solids and membranes. For solids, LR NURBS-enriched finite elements are used to discretize the contact surfaces with LR NURBS finite elements, while the rest of the body is discretized by linear Lagrange finite elements. For membranes, the entire surface is discretized by LR NURBS. Various numerical examples are shown, and they demonstrate the benefit of using LR NURBS: Compared to uniform refinement, LR NURBS can achieve high accuracy at lower computational cost.

  6. Computational model of surface ablation from tokamak disruptions

    International Nuclear Information System (INIS)

    Ehst, D.; Hassanein, A.

    1993-10-01

    Energy transfer to material surfaces is dominated by photon radiation through low temperature plasma vapors if tokamak disruptions are due to low kinetic energy particles (< 100 eV). Simple models of radiation transport are derived and incorporated into a fast-running computer routine to model this process. The results of simulations are in fair agreement with plasma gun erosion tests on several metal targets.

  7. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, an approximation to the (b) potential surface, its (c) gradients, and (d) a Shannon information theory based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to compute (a) potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene) where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which application to hydrogen-transfer reactions and hydrogen
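
    As a purely illustrative sketch (not the authors' exact criteria), a sampling measure that combines a Shannon-entropy weight from the wavepacket density with a potential-gradient weight might look like the following on a 1D grid with a toy double-well potential.

        import numpy as np

        # 1D grid, a toy double-well potential and a Gaussian wavepacket density on it.
        x = np.linspace(-2.0, 2.0, 401)
        V = (x**2 - 1.0) ** 2
        rho = np.exp(-((x + 1.0) / 0.25) ** 2)

        # Pointwise Shannon-entropy contribution of the (normalized) wavepacket density,
        # and the magnitude of the potential gradient; both scaled to [0, 1] and summed.
        p = rho / rho.sum()
        entropy_w = -p * np.log(p + 1e-300)
        grad_w = np.abs(np.gradient(V, x))
        measure = entropy_w / entropy_w.max() + grad_w / grad_w.max()

        # Place the next batch of expensive electronic-structure evaluations where the
        # combined measure is largest.
        new_points = np.sort(x[np.argsort(measure)[-10:]])
        print(new_points)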

  8. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that contains network relationships among the policy concepts to support inferencing based on information represented in the generic policy description

  9. Computed Potential Energy Surfaces and Minimum Energy Pathway for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for computation of such observables as rate constants as a function of temperature, product branching ratios, and other detailed properties. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method with the Dunning correlation consistent basis sets to obtain accurate energetics, gives useful results for a number of chemically important systems. Applications to complex reactions leading to NO and soot formation in hydrocarbon combustion are discussed.

  10. On the theory and computation of surface tension: The elimination of parasitic currents through energy conservation in the second-gradient method

    International Nuclear Information System (INIS)

    Jamet, Didier; Torres, David; Brackbill, J.U.

    2002-01-01

    Errors in the computation of fluid flows with surface tension are examined. These errors cause large parasitic flows when the capillary number is large and have often been attributed to truncation error in underresolved interfacial regions. A study using the second-gradient method reveals that when truncation error is eliminated in the computation of energy exchanges between surface and kinetic energies so that energy is strictly conserved, the parasitic currents are reduced to round-off. The results are based on general thermodynamic arguments and can be used to guide improvements in other methods, such as the continuum-surface-force (CSF) method, which is commonly used with the volume-of-fluid (VOF) method

  11. On the theory and computation of surface tension: The elimination of parasitic currents through energy conservation in the second-gradient method

    CERN Document Server

    Jamet, D; Brackbill, J U

    2002-01-01

    Errors in the computation of fluid flows with surface tension are examined. These errors cause large parasitic flows when the capillary number is large and have often been attributed to truncation error in underresolved interfacial regions. A study using the second-gradient method reveals that when truncation error is eliminated in the computation of energy exchanges between surface and kinetic energies so that energy is strictly conserved, the parasitic currents are reduced to round-off. The results are based on general thermodynamic arguments and can be used to guide improvements in other methods, such as the continuum-surface-force (CSF) method, which is commonly used with the volume-of-fluid (VOF) method.

  12. Modification of silicon nitride surfaces with GOPES and APTES for antibody immobilization: computational and experimental studies

    International Nuclear Information System (INIS)

    To, Thien Dien; Nguyen, Anh Tuan; Phan, Khoa Nhat Thanh; Truong, An Thu Thi; Doan, Tin Chanh Duc; Dang, Chien Mau

    2015-01-01

    Chemical modification of silicon nitride (SiN) surfaces by silanization has been widely studied, especially with 3-(aminopropyl)triethoxysilane (APTES) and 3-(glycidyloxypropyl) dimethylethoxysilane (GOPES). However, few reports have combined experimental and computational studies. In this study, surface modification of SiN surfaces with GOPES and with APTES covalently bound to glutaraldehyde (GTA) was investigated for antibody immobilization. The monoclonal anti-cytokeratin-FITC (MACF) antibody was immobilized on the modified SiN surfaces. The modified surfaces were characterized by water contact angle measurements, atomic force microscopy and fluorescence microscopy. The FITC fluorescent label indicated the presence of the MACF antibody on the SiN surfaces and the efficiency of the silanization reaction. Adsorption of APTES and GOPES on the oxidized SiN surfaces was computationally modeled and calculated with Materials Studio software. The computational and experimental results showed that modification of the SiN surfaces with APTES and GTA was more effective than modification with GOPES. (paper)

  13. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale services shared among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied to cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This efficiency, together with the model's scalability, makes it well suited to the massive-scale cloud.

  14. Computer aided fixture design - A case based approach

    Science.gov (United States)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once the fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and placed into position so that the assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a developed retrieval system is proposed. The Visual Basic (VB) programming language is integrated with the SolidWorks API (application programming interface) module to improve the retrieval procedure and reduce computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
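
    A minimal sketch of the case-retrieval step in plain Python is given below; the fixture-case attributes, weights, and case base are hypothetical and stand in for the SolidWorks/VB retrieval system described in the paper.

      # Case-based retrieval sketch: score stored fixture-design cases by a weighted
      # attribute similarity and return the best match. All data are hypothetical.
      def similarity(query, case, weights):
          matches = sum(w for attr, w in weights.items() if query[attr] == case[attr])
          return matches / sum(weights.values())

      case_base = [
          {"id": "C1", "part_shape": "prismatic",   "material": "steel",     "locating_faces": 3},
          {"id": "C2", "part_shape": "cylindrical", "material": "aluminium", "locating_faces": 2},
          {"id": "C3", "part_shape": "prismatic",   "material": "aluminium", "locating_faces": 3},
      ]
      weights = {"part_shape": 0.5, "material": 0.2, "locating_faces": 0.3}
      query = {"part_shape": "prismatic", "material": "steel", "locating_faces": 2}

      ranked = sorted(case_base, key=lambda c: similarity(query, c, weights), reverse=True)
      print("Best matching fixture case:", ranked[0]["id"])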

  15. Relationship Between Ocular Surface Disease Index, Dry Eye Tests, and Demographic Properties in Computer Users

    Directory of Open Access Journals (Sweden)

    Hüseyin Simavlı

    2014-03-01

    Full Text Available Objectives: The aim of the present study is to evaluate the ocular surface disease index (OSDI) in computer users and to investigate the correlations of this index with dry eye tests and demographic properties. Materials and Methods: In this prospective study, 178 subjects with an age range of 20-40 years who spent most of their daily life in front of a computer were included. All participants underwent a complete ophthalmologic examination including the basal secretion test, tear break-up time test, and ocular surface staining. In addition, all participants completed the OSDI questionnaire. Results: A total of 178 volunteers (101 female, 77 male) with a mean age of 28.8±4.5 years were included in the study. Mean daily computer use was 7.7±1.9 (5-14) hours, and mean duration of computer use was 71.1±39.7 (4-204) months. Mean OSDI score was 44.1±24.7 (0-100). There was a significant negative correlation between the OSDI score and the tear break-up time test in the right (p=0.005, r=-0.21) and left eyes (p=0.003, r=-0.22). There was a significant positive correlation between the OSDI score and gender (p=0.014, r=0.18) and daily computer usage time (p=0.008, r=0.2). In addition, there was a significant positive correlation between the OSDI score and the ocular surface staining pattern in the right (p=0.03, r=0.16) and left eyes (p=0.03, r=0.17). Age, smoking, type of computer, use of glasses, presence of symptoms, and the basal secretion test were not found to be correlated with the OSDI score. Conclusions: Long-term computer use causes ocular surface problems. The OSDI score was found to be correlated with the tear break-up time test, gender, daily computer usage time, and ocular surface staining pattern in computer users. (Turk J Ophthalmol 2014; 44: 115-8)
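
    The kind of correlation analysis reported above can be reproduced as follows, assuming NumPy and SciPy; the OSDI and tear break-up time values below are synthetic, not the study data.

      # Correlation of OSDI score with tear break-up time (TBUT); synthetic data.
      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      tbut = rng.uniform(3, 15, size=50)                       # break-up time (s)
      osdi = 60 - 2.5 * tbut + rng.normal(0, 10, size=50)      # OSDI score
      osdi = np.clip(osdi, 0, 100)                             # keep on the 0-100 scale

      r, p = pearsonr(osdi, tbut)
      print(f"r = {r:.2f}, p = {p:.3f}")   # a negative r mirrors the reported trend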

  16. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers; little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  17. Effect of Non-Equilibrium Surface Thermochemistry in Simulation of Carbon Based Ablators

    Science.gov (United States)

    Chen, Yih-Kanq; Gokcen, Tahir

    2012-01-01

    This study demonstrates that coupling of a material thermal response code and a flow solver using non-equilibrium gas/surface interaction model provides time-accurate solutions for the multidimensional ablation of carbon based charring ablators. The material thermal response code used in this study is the Two-dimensional Implicit Thermal-response and AblatioN Program (TITAN), which predicts charring material thermal response and shape change on hypersonic space vehicles. Its governing equations include total energy balance, pyrolysis gas mass conservation, and a three-component decomposition model. The flow code solves the reacting Navier-Stokes equations using Data Parallel Line Relaxation (DPLR) method. Loose coupling between the material response and flow codes is performed by solving the surface mass balance in DPLR and the surface energy balance in TITAN. Thus, the material surface recession is predicted by finite-rate gas/surface interaction boundary conditions implemented in DPLR, and the surface temperature and pyrolysis gas injection rate are computed in TITAN. Two sets of nonequilibrium gas/surface interaction chemistry between air and the carbon surface developed by Park and Zhluktov, respectively, are studied. Coupled fluid-material response analyses of stagnation tests conducted in NASA Ames Research Center arc-jet facilities are considered. The ablating material used in these arc-jet tests was Phenolic Impregnated Carbon Ablator (PICA). Computational predictions of in-depth material thermal response and surface recession are compared with the experimental measurements for stagnation cold wall heat flux ranging from 107 to 1100 Watts per square centimeter.

  18. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  19. Computer based system for measuring the minority carrier lifetime in the solar cells

    International Nuclear Information System (INIS)

    Morales A, A.; Casados C, G.

    1994-01-01

    We show the development of a computer-based system for measuring the minority carrier lifetime in the base of silicon solar cells. The system allows the use of two different techniques for such measurements: the open-circuit voltage decay (OCVD) and the surface voltage decay (SVD). The equipment is based on internal cards for IBM-PC or compatible computers that work as an oscilloscope and as a function generator, in addition to a synchronization and signal-conditioning circuit. The system is fully controlled by a C-language program that optimizes the use of the instrument built in this way and analyzes the measurement data by curve-fitting techniques. We show typical results obtained with silicon solar cells made in our laboratories. (Author)

  20. Multidimensional control using a mobile-phone based brain-muscle-computer interface.

    Science.gov (United States)

    Vernon, Scott; Joshi, Sanjay S

    2011-01-01

    Many well-known brain-computer interfaces measure signals at the brain, and then rely on the brain's ability to learn via operant conditioning in order to control objects in the environment. In our lab, we have been developing brain-muscle-computer interfaces, which measure signals at a single muscle and then rely on the brain's ability to learn neuromuscular skills via operant conditioning. Here, we report a new mobile-phone based brain-muscle-computer interface prototype for severely paralyzed persons, based on previous results from our group showing that humans may actively create specified power levels in two separate frequency bands of a single sEMG signal. Electromyographic activity on the surface of a single face muscle (Auricularis superior) is recorded with a standard electrode. This analog electrical signal is imported into an Android-based mobile phone. User-modulated power in two separate frequency bands serves as two separate and simultaneous control channels for machine control. After signal processing, the Android phone sends commands to external devices via Bluetooth. Users are trained to use the device via biofeedback, with simple cursor-to-target activities on the phone screen.

  1. Topological Superconductivity on the Surface of Fe-Based Superconductors.

    Science.gov (United States)

    Xu, Gang; Lian, Biao; Tang, Peizhe; Qi, Xiao-Liang; Zhang, Shou-Cheng

    2016-07-22

    As one of the simplest systems for realizing Majorana fermions, the topological superconductor plays an important role in both condensed matter physics and quantum computations. Based on ab initio calculations and the analysis of an effective 8-band model with superconducting pairing, we demonstrate that the three-dimensional extended s-wave Fe-based superconductors such as Fe_{1+y}Se_{0.5}Te_{0.5} have a metallic topologically nontrivial band structure, and exhibit a normal-topological-normal superconductivity phase transition on the (001) surface by tuning the bulk carrier doping level. In the topological superconductivity (TSC) phase, a Majorana zero mode is trapped at the end of a magnetic vortex line. We further show that the surface TSC phase only exists up to a certain bulk pairing gap, and there is a normal-topological phase transition driven by the temperature, which has not been discussed before. These results pave an effective way to realize the TSC and Majorana fermions in a large class of superconductors.

  2. A Collaborative Approach for Surface Inspection Using Aerial Robots and Computer Vision

    Directory of Open Access Journals (Sweden)

    Martin Molina

    2018-03-01

    Full Text Available Aerial robots with cameras on board can be used in surface inspection to observe areas that are difficult to reach by other means. In this type of problem, it is desirable for aerial robots to have a high degree of autonomy. A way to provide more autonomy would be to use computer vision techniques to automatically detect anomalies on the surface. However, the performance of automated visual recognition methods is limited in uncontrolled environments, so that in practice it is not possible to perform a fully automatic inspection. This paper presents a solution for visual inspection that increases the degree of autonomy of aerial robots following a semi-automatic approach. The solution is based on human-robot collaboration in which the operator delegates tasks to the drone for exploration and visual recognition and the drone requests assistance in the presence of uncertainty. We validate this proposal with the development of an experimental robotic system using the software framework Aerostack. The paper describes the technical challenges that we had to solve to develop such a system and the impact of this solution on the degree of autonomy in detecting anomalies on the surface.

  3. Computer screen photo-excited surface plasmon resonance imaging.

    Science.gov (United States)

    Filippini, Daniel; Winquist, Fredrik; Lundström, Ingemar

    2008-09-12

    Angle- and spectrally resolved surface plasmon resonance (SPR) images of gold and silver thin films with protein deposits are demonstrated using a regular computer screen as light source and a web camera as detector. The screen provides multiple-angle illumination, p-polarized light and controlled spectral radiances to excite surface plasmons in a Kretschmann configuration. A model of the SPR reflectances incorporating the particularities of the source and detector explains the observed signals, and the generation of distinctive SPR landscapes is demonstrated. The sensitivity and resolution of the method, determined in air and solution, are 0.145 nm pixel⁻¹, 0.523 nm, 5.13×10⁻³ RIU degree⁻¹ and 6.014×10⁻⁴ RIU, respectively; these are encouraging results at this proof-of-concept stage, considering the ubiquity of the instrumentation.

  4. Contributions of computational chemistry and biophysical techniques to fragment-based drug discovery.

    Science.gov (United States)

    Gozalbes, Rafael; Carbajo, Rodrigo J; Pineda-Lucena, Antonio

    2010-01-01

    In the last decade, fragment-based drug discovery (FBDD) has evolved from a novel approach in the search of new hits to a valuable alternative to the high-throughput screening (HTS) campaigns of many pharmaceutical companies. The increasing relevance of FBDD in the drug discovery universe has been concomitant with an implementation of the biophysical techniques used for the detection of weak inhibitors, e.g. NMR, X-ray crystallography or surface plasmon resonance (SPR). At the same time, computational approaches have also been progressively incorporated into the FBDD process and nowadays several computational tools are available. These stretch from the filtering of huge chemical databases in order to build fragment-focused libraries comprising compounds with adequate physicochemical properties, to more evolved models based on different in silico methods such as docking, pharmacophore modelling, QSAR and virtual screening. In this paper we will review the parallel evolution and complementarities of biophysical techniques and computational methods, providing some representative examples of drug discovery success stories by using FBDD.

  5. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  6. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
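
    A minimal sketch of the idea behind identity-circuit benchmarks, in plain NumPy: a gate sequence followed by its inverse should return the initial state, so any drop in the survival probability of |0> measures accumulated gate error. The single-qubit circuit and the coherent over-rotation error model below are illustrative simplifications, not the benchmarks run on the IBM Quantum Experience.

      # Identity-circuit benchmark sketch: random X-rotations followed by their
      # commanded inverses; a small systematic over-rotation on every applied gate
      # makes the survival probability of |0> decay with circuit depth.
      import numpy as np

      def rx(theta):
          c, s = np.cos(theta / 2), -1j * np.sin(theta / 2)
          return np.array([[c, s], [s, c]])

      def survival_probability(depth, over_rotation, rng):
          state = np.array([1.0 + 0j, 0.0 + 0j])          # start in |0>
          angles = rng.uniform(0, np.pi, size=depth)
          for a in angles:                                 # forward gates (noisy)
              state = rx(a + over_rotation) @ state
          for a in reversed(angles):                       # inverse gates (also noisy)
              state = rx(-a + over_rotation) @ state
          return abs(state[0]) ** 2

      rng = np.random.default_rng(1)
      for depth in (1, 5, 20, 50):
          p = survival_probability(depth, over_rotation=0.01, rng=rng)
          print(f"depth {depth:3d}: P(|0>) = {p:.4f}")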

  7. Attack surfaces

    DEFF Research Database (Denmark)

    Gruschka, Nils; Jensen, Meiko

    2010-01-01

    The new paradigm of cloud computing poses severe security risks to its adopters. In order to cope with these risks, appropriate taxonomies and classification criteria for attacks on cloud computing are required. In this work-in-progress paper we present one such taxonomy based on the notion of attack surfaces of the cloud computing scenario participants.

  8. Object-based Dimensionality Reduction in Land Surface Phenology Classification

    Directory of Open Access Journals (Sweden)

    Brian E. Bunker

    2016-11-01

    Full Text Available Unsupervised classification or clustering of multi-decadal land surface phenology provides a spatio-temporal synopsis of natural and agricultural vegetation response to environmental variability and anthropogenic activities. Notwithstanding the detailed temporal information available in calibrated bi-monthly normalized difference vegetation index (NDVI and comparable time series, typical pre-classification workflows average a pixel’s bi-monthly index within the larger multi-decadal time series. While this process is one practical way to reduce the dimensionality of time series with many hundreds of image epochs, it effectively dampens temporal variation from both intra and inter-annual observations related to land surface phenology. Through a novel application of object-based segmentation aimed at spatial (not temporal dimensionality reduction, all 294 image epochs from a Moderate Resolution Imaging Spectroradiometer (MODIS bi-monthly NDVI time series covering the northern Fertile Crescent were retained (in homogenous landscape units as unsupervised classification inputs. Given the inherent challenges of in situ or manual image interpretation of land surface phenology classes, a cluster validation approach based on transformed divergence enabled comparison between traditional and novel techniques. Improved intra-annual contrast was clearly manifest in rain-fed agriculture and inter-annual trajectories showed increased cluster cohesion, reducing the overall number of classes identified in the Fertile Crescent study area from 24 to 10. Given careful segmentation parameters, this spatial dimensionality reduction technique augments the value of unsupervised learning to generate homogeneous land surface phenology units. By combining recent scalable computational approaches to image segmentation, future work can pursue new global land surface phenology products based on the high temporal resolution signatures of vegetation index time series.
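
    A minimal sketch of the workflow, assuming NumPy and scikit-learn: full-length NDVI time series are averaged within (pre-computed) image segments, and the segment-level series are then clustered into phenology classes. The NDVI values and segment labels below are synthetic stand-ins for the MODIS data and the object-based segmentation.

      # Object-based (spatial) dimensionality reduction followed by unsupervised
      # classification of NDVI time series; data and segment labels are synthetic.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_pixels, n_epochs, n_segments = 500, 294, 50       # 294 bi-monthly epochs
      t = np.arange(n_epochs)
      base = 0.4 + 0.3 * np.sin(2 * np.pi * t / 23)       # roughly annual NDVI cycle
      ndvi = base + rng.normal(0, 0.05, size=(n_pixels, n_epochs))

      segments = np.arange(n_pixels) % n_segments         # pretend segmentation output
      segment_series = np.vstack(
          [ndvi[segments == s].mean(axis=0) for s in range(n_segments)]
      )

      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(segment_series)
      print("Phenology class of first ten segments:", labels[:10])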

  9. The Design of Case Products’ Shape Form Information Database Based on NURBS Surface

    Science.gov (United States)

    Liu, Xing; Liu, Guo-zhong; Xu, Nuo-qi; Zhang, Wei-she

    2017-07-01

    In order to improve the computer-aided design of product shapes, applying non-uniform rational B-spline (NURBS) curves and surfaces to the representation of product shape helps designers design products effectively. On the basis of contour extraction from typical product images, and using Pro/Engineer (Pro/E) to extract the geometric features of a scanned mold, an information database of value points, control points and knot vector parameters is built. This paper puts forward a unified expression method that uses NURBS curves and surfaces to describe the geometric shape of products with the same or similar function, and uses MATLAB for simulation. A case study of an electric vehicle's front cover illustrates how the geometric shape information of a case product is accessed. This method can not only greatly reduce the required capacity of the information database, but also improve the effectiveness of computer-aided geometric innovation modeling.
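
    A minimal sketch of how the parameters such a database stores (control points, weights, and a knot vector) define points on a NURBS curve is given below, in plain NumPy; the quarter-circle data are illustrative and unrelated to the electric-vehicle case study.

      # Evaluate a point on a NURBS curve from control points, weights and a knot
      # vector via the Cox-de Boor recursion. The quarter-circle data are illustrative.
      import numpy as np

      def basis(i, p, u, knots):
          """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
          if p == 0:
              return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
          left = right = 0.0
          if knots[i + p] != knots[i]:
              left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
          if knots[i + p + 1] != knots[i + 1]:
              right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                       * basis(i + 1, p - 1, u, knots))
          return left + right

      def nurbs_point(u, degree, ctrl, weights, knots):
          num, den = np.zeros(2), 0.0
          for i, (P, w) in enumerate(zip(ctrl, weights)):
              N = basis(i, degree, u, knots)
              num += N * w * np.asarray(P, dtype=float)
              den += N * w
          return num / den

      # Quarter circle as a degree-2 NURBS: 3 control points, middle weight sqrt(1/2).
      ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
      weights = [1.0, np.sqrt(0.5), 1.0]
      knots = [0, 0, 0, 1, 1, 1]
      for u in (0.0, 0.25, 0.5, 0.75):
          x, y = nurbs_point(u, 2, ctrl, weights, knots)
          print(f"u={u:.2f}: ({x:.4f}, {y:.4f}), radius={np.hypot(x, y):.4f}")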

  10. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and for harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper, based on granular computing theory and the traditional BDI agent model, to give agents the ability to handle emotions. Firstly, a new emotion knowledge base based on granular computing is presented for emotion expression in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  11. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Science.gov (United States)

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.

  12. Computer-Assisted Search Of Large Textual Data Bases

    Science.gov (United States)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  13. A density gradient theory based method for surface tension calculations

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Michelsen, Michael Locht; Kontogeorgis, Georgios

    2016-01-01

    The density gradient theory has become a widely used framework for calculating surface tension, within which the same equation of state is used for the interface and the bulk phases, because it is a theoretically sound, consistent and computationally affordable approach. Based on the observation that the optimal density path from the geometric-mean density gradient theory passes through the saddle point of the tangent plane distance to the bulk phases, we propose to estimate surface tension with an approximate density path profile that goes through this saddle point. The linear density gradient theory, which assumes linearly distributed densities between the two bulk phases, has also been investigated. Numerical problems do not occur with these density path profiles. These two approximation methods, together with the full density gradient theory, have been used to calculate the surface tension of various

  14. Surface stress-based biosensors.

    Science.gov (United States)

    Sang, Shengbo; Zhao, Yuan; Zhang, Wendong; Li, Pengwei; Hu, Jie; Li, Gang

    2014-01-15

    Surface stress-based biosensors, one kind of label-free biosensor, have attracted much attention for information gathering and measurement in biological, chemical and medical applications as technology and society have developed. This kind of biosensor offers many advantages, such as short response times (less than milliseconds) and typical sensitivities at the nanogram, picoliter, femtojoule and attomolar level. Furthermore, it simplifies sample preparation and testing procedures. In this work, progress made towards the use of surface stress-based biosensors for achieving better performance is critically reviewed, including our recent achievement, the optimized circular membrane-based biosensors and biosensor array. The further scientific and technological challenges in this field are also summarized. Critical remarks and future steps towards the ultimate surface stress-based biosensors are addressed. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Computer simulation of biomolecule–biomaterial interactions at surfaces and interfaces

    International Nuclear Information System (INIS)

    Wang, Qun; Wang, Meng-hao; Lu, Xiong; Wang, Ke-feng; Zhang, Xing-dong; Liu, Yaling; Zhang, Hong-ping

    2015-01-01

    Biomaterial surfaces and interfaces are intrinsically complicated systems because they involve biomolecules, implanted biomaterials, and complex biological environments. It is difficult to understand the interaction mechanism between biomaterials and biomolecules through conventional experimental methods. Computer simulation is an effective way to study the interaction mechanism at the atomic and molecular levels. In this review, we summarize recent studies on the interaction behaviors of biomolecules with three types of the most widely used biomaterials: hydroxyapatite (HA), titanium oxide (TiO2), and graphene (G)/graphene oxide (GO). The effects of crystal forms, crystallographic planes, surface defects, doping atoms, and water environments on biomolecule adsorption are discussed in detail. This review provides valuable theoretical guidance for biomaterial design and surface modification. (topical review)

  16. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nanoscale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore, intelligent sampling strategies are required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining, among the candidates, the position that is most likely to lie outside the required tolerance zone, and is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency when measuring large-area structured surfaces.
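
    A minimal sketch of the adaptive selection step on a one-dimensional profile, assuming scikit-learn and SciPy: a Gaussian process is fitted to the points measured so far, and the next sample is placed where the predicted surface is most likely to fall outside a tolerance band. The surface function, tolerance, and kernel settings are illustrative.

      # Gaussian-process-based intelligent sampling sketch: fit a GP to measured
      # points, then add the candidate most likely to violate a height tolerance.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def surface(x):                                   # toy "true" profile (um)
          return 0.05 * np.sin(8 * x) + 0.02 * x

      tolerance = 0.04                                  # acceptable |height| band
      candidates = np.linspace(0, 1, 200)[:, None]
      measured_x = list(np.linspace(0, 1, 5))           # coarse initial scan

      for _ in range(10):
          X = np.array(measured_x)[:, None]
          y = surface(X.ravel())
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6).fit(X, y)
          mean, std = gp.predict(candidates, return_std=True)
          std = np.maximum(std, 1e-9)                   # guard against zero variance
          # probability that the true surface lies outside the +/- tolerance band
          p_out = norm.cdf(-(tolerance - mean) / std) + norm.cdf(-(tolerance + mean) / std)
          measured_x.append(float(candidates[int(np.argmax(p_out)), 0]))

      print("Adaptively chosen sample positions:", np.round(sorted(measured_x), 3))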

  17. Raster Scan Computer Image Generation (CIG) System Based On Refresh Memory

    Science.gov (United States)

    Dichter, W.; Doris, K.; Conkling, C.

    1982-06-01

    A full color, Computer Image Generation (CIG) raster visual system has been developed which provides a high level of training sophistication by utilizing advanced semiconductor technology and innovative hardware and firmware techniques. Double buffered refresh memory and efficient algorithms eliminate the problem of conventional raster line ordering by allowing the generated image to be stored in a random fashion. Modular design techniques and simplified architecture provide significant advantages in reduced system cost, standardization of parts, and high reliability. The major system components are a general purpose computer to perform interfacing and data base functions; a geometric processor to define the instantaneous scene image; a display generator to convert the image to a video signal; an illumination control unit which provides final image processing; and a CRT monitor for display of the completed image. Additional optional enhancements include texture generators, increased edge and occultation capability, curved surface shading, and data base extensions.

  18. Efficient computer program EPAS-J1 for calculating stress intensity factors of three-dimensional surface cracks

    International Nuclear Information System (INIS)

    Miyazaki, Noriyuki; Watanabe, Takayuki; Yagawa, Genki.

    1982-03-01

    A finite element computer program, EPAS-J1, was developed to calculate the stress intensity factors of three-dimensional cracks. In the program, the stress intensity factor is determined by the virtual crack extension method together with distorted elements allocated along the crack front. The program also includes connection elements, based on the Lagrange multiplier concept, to connect different kinds of elements such as solid and shell elements, or shell and beam elements. For a structure containing a three-dimensional surface crack, solid elements are employed only in the neighborhood of the crack, while the remainder of the structure is modeled by shell or beam elements, because the crack singularity is very local. Computer storage and computational time can be greatly reduced by applying this modeling technique to the calculation of stress intensity factors of three-dimensional surface cracks, because the three-dimensional solid elements are required only around the crack front. Several numerical analyses were performed with the EPAS-J1 program. First, the accuracies of the connection element and the virtual crack extension method were confirmed using simple structures. Compared with other techniques for connecting different kinds of elements, such as the tying method or the method using an anisotropic plate element, the present connection element is found to provide better results. It is also found that the virtual crack extension method provides accurate stress intensity factors. Furthermore, results are presented for stress intensity factor analyses of cylinders with longitudinal or circumferential surface cracks using combinations of the various kinds of elements together with the connection elements. (author)

  19. A computational ab initio study of surface diffusion of sulfur on the CdTe (111) surface

    Energy Technology Data Exchange (ETDEWEB)

    Naderi, Ebadollah, E-mail: enaderi42@gmail.com [Department of Physics, Savitribai Phule Pune University (SPPU), Pune-411007 (India); Ghaisas, S. V. [Department of Electronic Science, Savitribai Phule Pune University (SPPU), Pune-411007 (India)

    2016-08-15

    In order to discern the formation of epitaxial growth of CdS shell over CdTe nanocrystals, kinetics related to the initial stages of the growth of CdS on CdTe is investigated using ab-initio methods. We report diffusion of sulfur adatom on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces within the density functional theory (DFT). The barriers are computed by applying the climbing Nudge Elastic Band (c-NEB) method. From the results surface hopping emerges as the major mode of diffusion. In addition, there is a distinct contribution from kick-out type diffusion in which a CdTe surface atom is kicked out from its position and is replaced by the diffusing sulfur atom. Also, surface vacancy substitution contributes to the concomitant dynamics. There are sites on the B- type surface that are competitively close in terms of the binding energy to the lowest energy site of epitaxy on the surface. The kick-out process is more likely for B-type surface where a Te atom of the surface is displaced by a sulfur adatom. Further, on the B-type surface, subsurface migration of sulfur is indicated. Furthermore, the binding energies of S on CdTe reveal that on the A-type surface, epitaxial sites provide relatively higher binding energies and barriers than on B-type.

  20. A computational ab initio study of surface diffusion of sulfur on the CdTe (111) surface

    Science.gov (United States)

    Naderi, Ebadollah; Ghaisas, S. V.

    2016-08-01

    In order to discern the formation of epitaxial growth of CdS shell over CdTe nanocrystals, kinetics related to the initial stages of the growth of CdS on CdTe is investigated using ab-initio methods. We report diffusion of sulfur adatom on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces within the density functional theory (DFT). The barriers are computed by applying the climbing Nudge Elastic Band (c-NEB) method. From the results surface hopping emerges as the major mode of diffusion. In addition, there is a distinct contribution from kick-out type diffusion in which a CdTe surface atom is kicked out from its position and is replaced by the diffusing sulfur atom. Also, surface vacancy substitution contributes to the concomitant dynamics. There are sites on the B- type surface that are competitively close in terms of the binding energy to the lowest energy site of epitaxy on the surface. The kick-out process is more likely for B-type surface where a Te atom of the surface is displaced by a sulfur adatom. Further, on the B-type surface, subsurface migration of sulfur is indicated. Furthermore, the binding energies of S on CdTe reveal that on the A-type surface, epitaxial sites provide relatively higher binding energies and barriers than on B-type.

  1. A computational ab initio study of surface diffusion of sulfur on the CdTe (111) surface

    International Nuclear Information System (INIS)

    Naderi, Ebadollah; Ghaisas, S. V.

    2016-01-01

    In order to discern the formation of epitaxial growth of CdS shell over CdTe nanocrystals, kinetics related to the initial stages of the growth of CdS on CdTe is investigated using ab-initio methods. We report diffusion of sulfur adatom on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces within the density functional theory (DFT). The barriers are computed by applying the climbing Nudge Elastic Band (c-NEB) method. From the results surface hopping emerges as the major mode of diffusion. In addition, there is a distinct contribution from kick-out type diffusion in which a CdTe surface atom is kicked out from its position and is replaced by the diffusing sulfur atom. Also, surface vacancy substitution contributes to the concomitant dynamics. There are sites on the B- type surface that are competitively close in terms of the binding energy to the lowest energy site of epitaxy on the surface. The kick-out process is more likely for B-type surface where a Te atom of the surface is displaced by a sulfur adatom. Further, on the B-type surface, subsurface migration of sulfur is indicated. Furthermore, the binding energies of S on CdTe reveal that on the A-type surface, epitaxial sites provide relatively higher binding energies and barriers than on B-type.

  2. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital.

  3. STM investigation of epitaxial Si growth for the fabrication of a Si-based quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Oberbeck, Lars; Hallam, Toby; Curson, Neil J.; Simmons, Michelle Y.; Clark, Robert G

    2003-05-15

    We investigate the morphology of epitaxial Si layers grown on clean and on hydrogen terminated Si(0 0 1) to explore the growth strategy for the fabrication of a Si-based quantum computer. We use molecular beam epitaxy to deposit 5 monolayers of silicon at a temperature of 250 deg. C and scanning tunnelling microscopy to image the surface at room temperature after growth and after various rapid annealing steps in the temperature range of 350-600 deg. C. The epitaxial layer grown on the hydrogenated surface shows a significantly higher surface roughness due to a lower mobility of silicon surface atoms in the presence of hydrogen. Annealing at temperatures ≥550 deg. C reduces the roughness of both epitaxial layers to the value of a clean silicon surface. However, the missing dimer defect density of the epitaxial layer grown on the hydrogenated surface remains higher by a factor of two compared to the layer grown on clean Si(0 0 1). Our results suggest a quantum computer growth strategy in which the hydrogen resist layer is desorbed before the epitaxial silicon layer is grown at low temperature to encapsulate phosphorus quantum bits.

  4. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  5. Multivariate tensor-based morphometry on surfaces: application to mapping ventricular abnormalities in HIV/AIDS.

    Science.gov (United States)

    Wang, Yalin; Zhang, Jie; Gutman, Boris; Chan, Tony F; Becker, James T; Aizenstein, Howard J; Lopez, Oscar L; Tamburo, Robert J; Toga, Arthur W; Thompson, Paul M

    2010-02-01

    Here we developed a new method, called multivariate tensor-based surface morphometry (TBM), and applied it to study lateral ventricular surface differences associated with HIV/AIDS. Using concepts from differential geometry and the theory of differential forms, we created mathematical structures known as holomorphic one-forms, to obtain an efficient and accurate conformal parameterization of the lateral ventricular surfaces in the brain. The new meshing approach also provides a natural way to register anatomical surfaces across subjects, and improves on prior methods as it handles surfaces that branch and join at complex 3D junctions. To analyze anatomical differences, we computed new statistics from the Riemannian surface metrics-these retain multivariate information on local surface geometry. We applied this framework to analyze lateral ventricular surface morphometry in 3D MRI data from 11 subjects with HIV/AIDS and 8 healthy controls. Our method detected a 3D profile of surface abnormalities even in this small sample. Multivariate statistics on the local tensors gave better effect sizes for detecting group differences, relative to other TBM-based methods including analysis of the Jacobian determinant, the largest and smallest eigenvalues of the surface metric, and the pair of eigenvalues of the Jacobian matrix. The resulting analysis pipeline may improve the power of surface-based morphometry studies of the brain. Copyright (c) 2009 Elsevier Inc. All rights reserved.

  6. Development of computational technique for labeling magnetic flux-surfaces

    International Nuclear Information System (INIS)

    Nunami, Masanori; Kanno, Ryutaro; Satake, Shinsuke; Hayashi, Takaya; Takamaru, Hisanori

    2006-03-01

    In recent Large Helical Device (LHD) experiments, radial profiles of ion temperature, electric field, etc. are measured in the m/n=1/1 magnetic island produced by island control coils, where m is the poloidal mode number and n the toroidal mode number. When the plasma transport in these radial profiles is numerically analyzed, an average over a magnetic flux-surface in the island is a very useful concept for understanding the transport. For such averaging, a proper labeling of the flux-surfaces is necessary. In general, it is not easy to label the flux-surfaces in a magnetic field with an island, compared with the case of a magnetic field configuration having nested flux-surfaces. In the present paper, we have developed a new computational technique to label the magnetic flux-surfaces. The technique is constructed using an optimization algorithm known as the simulated annealing method. The flux-surfaces are discerned by using two labels: one is the classification of the magnetic field structure, i.e., core, island, ergodic, and outside regions, and the other is the value of the toroidal magnetic flux. We have applied the technique to an LHD configuration with the m/n=1/1 island, and successfully obtained the discrimination of the magnetic field structure. (author)
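
    A generic simulated-annealing skeleton of the kind used for such labeling is sketched below in plain Python: points carrying a synthetic flux coordinate are assigned to one of k labels so as to minimize the within-label spread. The data, cost function, and cooling schedule are illustrative, not those of the LHD analysis.

      # Simulated-annealing sketch: assign each point one of k flux-surface labels
      # so that the spread of a flux coordinate within each label is minimized.
      import math
      import random

      random.seed(0)
      flux = [random.gauss(mu, 0.05) for mu in (0.2, 0.5, 0.8) for _ in range(30)]
      k = 3
      labels = [random.randrange(k) for _ in flux]

      def cost(labels):
          total = 0.0
          for c in range(k):
              members = [f for f, lab in zip(flux, labels) if lab == c]
              if members:
                  mean = sum(members) / len(members)
                  total += sum((f - mean) ** 2 for f in members)
          return total

      T, current = 1.0, cost(labels)
      while T > 1e-4:
          i = random.randrange(len(flux))
          old = labels[i]
          labels[i] = random.randrange(k)
          new = cost(labels)
          if new > current and random.random() > math.exp(-(new - current) / T):
              labels[i] = old                  # reject the uphill move
          else:
              current = new                    # accept the move
          T *= 0.999                           # geometric cooling

      print("final within-label spread:", round(current, 4))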

  7. Mobile computing device configured to compute irradiance, glint, and glare of the sun

    Science.gov (United States)

    Gupta, Vipin P; Ho, Clifford K; Khalsa, Siri Sahib

    2014-03-11

    Described herein are technologies pertaining to computing the solar irradiance distribution on a surface of a receiver in a concentrating solar power system or glint/glare emitted from a reflective entity. A mobile computing device includes at least one camera that captures images of the Sun and the entity of interest, wherein the images have pluralities of pixels having respective pluralities of intensity values. Based upon the intensity values of the pixels in the respective images, the solar irradiance distribution on the surface of the entity or glint/glare corresponding to the entity is computed by the mobile computing device.

  8. Evaluation of computer-based ultrasonic inservice inspection systems

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems.

  9. A 3-D Approach for Teaching and Learning about Surface Water Systems through Computational Thinking, Data Visualization and Physical Models

    Science.gov (United States)

    Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.

    2017-12-01

    Understanding water is central to understanding environmental challenges. Scientists use `big data' and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.

  10. Learning-based computing techniques in geoid modeling for precise height transformation

    Science.gov (United States)

    Erol, B.; Erol, S.

    2013-03-01

    Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained from properly distributed benchmarks having both GNSS and leveling observations, using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study evaluates learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural network (WNN) approach in geoid surface approximation. These algorithms were developed in parallel with advances in computer technology and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new in dealing with the precise modeling problem of the Earth's gravity field. In the scope of the study, these methods were applied to the Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, the ANFIS and WNN methods revealed higher prediction accuracies compared to the ANN and MPRE methods. Besides their prediction capabilities, these methods are also compared and discussed from a practical point of view in the conclusions.
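
    A minimal sketch of a learning-based geoid model, assuming scikit-learn: a small neural network maps benchmark coordinates to geoid undulations and is validated on held-out points. The synthetic benchmark data stand in for the Istanbul GPS Triangulation Network, and the network architecture is illustrative.

      # Learning-based geoid modeling sketch: fit latitude/longitude -> geoid
      # undulation on synthetic benchmarks and report a hold-out validation RMSE.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      lat = rng.uniform(40.8, 41.3, 300)                 # degrees
      lon = rng.uniform(28.5, 29.5, 300)
      undulation = (36.0 + 0.8 * (lat - 41.0) - 1.2 * (lon - 29.0)
                    + 0.3 * np.sin(6 * lon) + rng.normal(0, 0.02, 300))   # metres

      X = np.column_stack([lat, lon])
      X_tr, X_te, y_tr, y_te = train_test_split(X, undulation, test_size=0.3, random_state=0)

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                                         random_state=0))
      model.fit(X_tr, y_tr)
      rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
      print(f"hold-out RMSE: {100 * rmse:.1f} cm")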

  11. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

    Full Text Available In remote attestation under the trusted computing mode TCCP, the trusted computer (TC) bears an excessive burden, and the anonymity and the security of the platform configuration information of computing nodes cannot be guaranteed. To overcome these defects, and based on research on and analysis of current schemes, we propose an anonymous attestation protocol based on property certificates. The platform configuration information is converted by a matrix algorithm into a property certificate, and the remote attestation is implemented by a trusted ring signature scheme based on the strong RSA assumption. Through the trusted ring signature scheme based on property certificates, we achieve the anonymity of computing nodes and prevent the leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also extend the protocol and obtain an anonymous attestation scheme based on ECC. By scenario comparison, we show that the trusted ring signature scheme based on RSA has advantages as the number of ring members grows.

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  13. A computer program for fitting smooth surfaces to an aircraft configuration and other three dimensional geometries

    Science.gov (United States)

    Craidon, C. B.

    1975-01-01

    A computer program that uses a three-dimensional geometric technique for fitting a smooth surface to the component parts of an aircraft configuration is presented. The resulting surface equations are useful in performing various kinds of calculations in which a three-dimensional mathematical description is necessary. Program options may be used to compute information for three-view and orthographic projections of the configuration as well as cross-section plots at any orientation through the configuration. The aircraft geometry input section of the program may be easily replaced with a surface point description in a different form so that the program could be of use for any three-dimensional surface equations.

  14. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we do need to be worried about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  15. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  16. The determination of surface of powders by BET method using nitrogen and krypton with computer calculation of the results

    International Nuclear Information System (INIS)

    Dembinski, W.; Zlotowski, T.

    1973-01-01

    A computer program written in the FORTRAN language for calculating the final results of specific surface analysis based on the BET theory is described. Two gases, nitrogen and krypton, were used. A technical description of the measuring apparatus is presented, as well as the theoretical basis of the calculations, together with a statistical analysis of the results for powders of uranium compounds. (author)
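
    The core calculation that such a program automates can be sketched as follows in NumPy: fit the linearized BET equation over the usual 0.05-0.30 relative-pressure range and convert the monolayer capacity to a specific surface area. The isotherm points below are synthetic, and the nitrogen cross-sectional area is the commonly used 16.2 Å² value.

      # BET specific-surface-area sketch: fit p/(v(p0-p)) = 1/(vm*c) + (c-1)/(vm*c)*(p/p0)
      # over 0.05 <= p/p0 <= 0.30, then convert the monolayer volume vm to m2/g.
      import numpy as np

      p_rel = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])   # p/p0
      v_ads = np.array([28.0, 33.5, 37.6, 41.2, 44.9, 48.8])   # cm3(STP)/g adsorbed N2

      y = p_rel / (v_ads * (1.0 - p_rel))                       # linearized BET ordinate
      slope, intercept = np.polyfit(p_rel, y, 1)

      vm = 1.0 / (slope + intercept)                            # monolayer volume, cm3(STP)/g
      c = 1.0 + slope / intercept                               # BET constant

      N_A = 6.022e23            # molecules per mole
      sigma_N2 = 16.2e-20       # m2, cross-sectional area of an adsorbed N2 molecule
      V_molar = 22414.0         # cm3(STP) per mole of gas
      surface_area = vm * N_A * sigma_N2 / V_molar              # m2/g

      print(f"vm = {vm:.1f} cm3/g, C = {c:.0f}, S_BET = {surface_area:.0f} m2/g")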

  17. Theory of surface enrichment in disordered monophasic binary alloys. Numerical computations for Ag-Au alloys

    NARCIS (Netherlands)

    Santen, van R.A.; Boersma, M.A.M.

    1974-01-01

    The regular solution model is used to compute the surface enrichment in the (111) and (100) faces of silver-gold alloys. Surface enrichment by silver is predicted to increase if the surface plane becomes less saturated, and to decrease if the temperature is raised. The possible implications of these

  18. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function network computer based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory, and supports processor-to-processor data access based on an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 assembly language is designed, and a programming environment for the ABC95 array computer based on VC++ is advanced. It includes functions to load ABC95 programs and data, store data, run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  19. Computational methods for investigation of surface curvature effects on airfoil boundary layer behavior

    Directory of Open Access Journals (Sweden)

    Xiang Shen

    2017-03-01

    Full Text Available This article presents computational algorithms for the design, analysis, and optimization of airfoil aerodynamic performance. The prescribed surface curvature distribution blade design (CIRCLE method is applied to a symmetrical airfoil NACA0012 and a non-symmetrical airfoil E387 to remove their surface curvature and slope-of-curvature discontinuities. Computational fluid dynamics analysis is used to investigate the effects of curvature distribution on aerodynamic performance of the original and modified airfoils. An inviscid–viscid interaction scheme is introduced to predict the positions of laminar separation bubbles. The results are compared with experimental data obtained from tests on the original airfoil geometry. The computed aerodynamic advantages of the modified airfoils are analyzed in different operating conditions. The leading edge singularity of NACA0012 is removed and it is shown that the surface curvature discontinuity affects aerodynamic performance near the stalling angle of attack. The discontinuous slope-of-curvature distribution of E387 results in a larger laminar separation bubble at lower angles of attack and lower Reynolds numbers. It also affects the inherent performance of the airfoil at higher Reynolds numbers. It is shown that at relatively high angles of attack, a continuous slope-of-curvature distribution reduces the skin friction by suppressing both laminar and turbulent separation, and by delaying laminar-turbulent transition. It is concluded that the surface curvature distribution has significant effects on the boundary layer behavior and consequently an improved curvature distribution will lead to higher aerodynamic efficiency.
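
    A minimal sketch of the quantity the CIRCLE method acts on is given below: the curvature and slope-of-curvature distributions of a NACA0012 upper surface, evaluated here with finite differences in NumPy rather than the analytic treatment used in the article.

      # Surface curvature and slope of curvature of the NACA0012 upper surface,
      # approximated with finite differences; the leading edge dominates both.
      import numpy as np

      t = 0.12                                     # NACA0012 thickness ratio
      x = np.linspace(1e-4, 1.0, 2000)             # chordwise coordinate (skip x = 0)
      y = 5 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                   + 0.2843 * x**3 - 0.1015 * x**4)

      dy = np.gradient(y, x)
      d2y = np.gradient(dy, x)
      curvature = d2y / (1.0 + dy**2) ** 1.5
      slope_of_curvature = np.gradient(curvature, x)

      i = np.argmax(np.abs(curvature))
      j = np.argmax(np.abs(slope_of_curvature))
      print(f"largest |curvature| {abs(curvature[i]):.1f} per chord at x/c = {x[i]:.4f}")
      print(f"largest |slope of curvature| at x/c = {x[j]:.4f}")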

  20. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational … [4] Mao C, Sun W, Shen Z and Seeman N C 1999. A nanomechanical device based on the B-Z transition of DNA; Nature 397 144–146.

  1. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language, JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Because the system is web based, users can start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and for queue management of the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
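
    The heart of such a platform is a queue that hands out small spatial/computational work units to volunteer nodes and collects their results. The sketch below is a deliberately simplified, in-memory illustration written in Python (the platform described in the record is JavaScript- and browser-based, backed by a relational database); all class and field names are hypothetical.

        import queue
        import uuid

        class VolunteerTaskQueue:
            """Minimal in-memory stand-in for the relational-database queue that hands
            out small simulation tiles to volunteer nodes."""

            def __init__(self, tiles):
                self.pending = queue.Queue()
                self.in_progress = {}          # task_id -> tile
                self.results = {}              # tile -> result
                for tile in tiles:
                    self.pending.put(tile)

            def checkout(self):
                """A volunteer node asks for work; returns (task_id, tile) or None."""
                if self.pending.empty():
                    return None
                tile = self.pending.get()
                task_id = str(uuid.uuid4())
                self.in_progress[task_id] = tile
                return task_id, tile

            def submit(self, task_id, result):
                """A volunteer node returns its computed result."""
                tile = self.in_progress.pop(task_id)
                self.results[tile] = result

        # Illustrative use: four tiles of a hypothetical routing grid, "computed" locally.
        q = VolunteerTaskQueue(tiles=[(0, 0), (0, 1), (1, 0), (1, 1)])
        while (job := q.checkout()) is not None:
            task_id, tile = job
            q.submit(task_id, {"tile": tile, "runoff": 0.0})   # placeholder result
        print(len(q.results), "tiles completed")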

  2. Computational design of surfaces, nanostructures and optoelectronic materials

    Science.gov (United States)

    Choudhary, Kamal

    Properties of engineering materials are generally influenced by defects such as point defects (vacancies, interstitials, substitutional defects), line defects (dislocations), planar defects (grain boundaries, free surfaces/nanostructures, interfaces, stacking faults) and volume defects (voids). Classical physics based molecular dynamics and quantum physics based density functional theory can be useful in designing materials with controlled defect properties. In this thesis, empirical potential based molecular dynamics was used to study the surface modification of polymers due to energetic polyatomic ion, thermodynamics and mechanics of metal-ceramic interfaces and nanostructures, while density functional theory was used to screen substituents in optoelectronic materials. Firstly, polyatomic ion-beams were deposited on polymer surfaces and the resulting chemical modifications of the surface were examined. In particular, S, SC and SH were deposited on amorphous polystyrene (PS), and C2H, CH3, and C3H5 were deposited on amorphous poly (methyl methacrylate) (PMMA) using molecular dynamics simulations with classical reactive empirical many-body (REBO) potentials. The objective of this work was to elucidate the mechanisms by which the polymer surface modification took place. The results of the work could be used in tailoring the incident energy and/or constituents of ion beam for obtaining a particular chemistry inside the polymer surface. Secondly, a new Al-O-N empirical potential was developed within the charge optimized many body (COMB) formalism. This potential was then used to examine the thermodynamic stability of interfaces and mechanical properties of nanostructures composed of aluminum, its oxide and its nitride. The potentials were tested for these materials based on surface energies, defect energies, bulk phase stability, the mechanical properties of the most stable bulk phase, its phonon properties as well as with a genetic algorithm based evolution theory of

  3. Computer-Based Learning in Chemistry Classes

    Science.gov (United States)

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  4. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students’ music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the current state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and that teachers have not yet found reasonable countermeasures. Against this background, introducing computer music software into music learning is a new approach that can not only cultivate students’ initiative in music learning but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  5. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  6. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics-based on finitely correlated or projected entangled pair states-to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  7. Virtual Ligand Screening Using PL-PatchSurfer2, a Molecular Surface-Based Protein-Ligand Docking Method.

    Science.gov (United States)

    Shin, Woong-Hee; Kihara, Daisuke

    2018-01-01

    Virtual screening is a computational technique for predicting a potent binding compound for a receptor protein from a ligand library. It has been widely used in the drug discovery field to reduce the effort required of medicinal chemists to find hit compounds by experiments. Here, we introduce our novel structure-based virtual screening program, PL-PatchSurfer, which uses a molecular surface representation based on three-dimensional Zernike descriptors, an effective mathematical representation for identifying physicochemical complementarity between local surfaces of a target protein and a ligand. The advantage of the surface-patch description is its tolerance to variation in receptor and compound structures. PL-PatchSurfer2 achieves higher accuracy on apo-form and computationally modeled receptor structures than conventional structure-based virtual screening programs. Thus, PL-PatchSurfer2 opens up an opportunity for targets that do not have crystal structures. The program is provided as a stand-alone program at http://kiharalab.org/plps2 . We also provide files for two ligand libraries, ChEMBL and ZINC Drug-like.

  8. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States); Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  9. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification

  10. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  11. Finite Elements on Point Based Surfaces

    NARCIS (Netherlands)

    Clarenz, U.; Rumpf, M.; Telea, A.

    2004-01-01

    We present a framework for processing point-based surfaces via partial differential equations (PDEs). Our framework efficiently and effectively brings well-known PDE-based processing techniques to the field of point-based surfaces. Our method is based on the construction of local tangent planes and

  12. Feature Surfaces in Symmetric Tensor Fields Based on Eigenvalue Manifold.

    Science.gov (United States)

    Palacios, Jonathan; Yeh, Harry; Wang, Wenping; Zhang, Yue; Laramee, Robert S; Sharma, Ritesh; Schultz, Thomas; Zhang, Eugene

    2016-03-01

    Three-dimensional symmetric tensor fields have a wide range of applications in solid and fluid mechanics. Recent advances in the (topological) analysis of 3D symmetric tensor fields focus on degenerate tensors which form curves. In this paper, we introduce a number of feature surfaces, such as neutral surfaces and traceless surfaces, into tensor field analysis, based on the notion of eigenvalue manifold. Neutral surfaces are the boundary between linear tensors and planar tensors, and the traceless surfaces are the boundary between tensors of positive traces and those of negative traces. Degenerate curves, neutral surfaces, and traceless surfaces together form a partition of the eigenvalue manifold, which provides a more complete tensor field analysis than degenerate curves alone. We also extract and visualize the isosurfaces of tensor modes, tensor isotropy, and tensor magnitude, which we have found useful for domain applications in fluid and solid mechanics. Extracting neutral and traceless surfaces using the Marching Tetrahedra method can cause the loss of geometric and topological details, which can lead to false physical interpretation. To robustly extract neutral surfaces and traceless surfaces, we develop a polynomial description of them which enables us to borrow techniques from algebraic surface extraction, a topic well-researched by the computer-aided design (CAD) community as well as the algebraic geometry community. In addition, we adapt the surface extraction technique, called A-patches, to improve the speed of finding degenerate curves. Finally, we apply our analysis to data from solid and fluid mechanics as well as scalar field analysis.
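
    The pointwise conditions behind these feature surfaces can be checked directly from a tensor's invariants. The sketch below is a simplified illustration, not the A-patches extraction used in the paper: it computes the trace and the commonly used tensor mode of a 3 × 3 symmetric tensor, taking mode = 0 as the neutral condition (middle eigenvalue equal to the mean of the other two) and trace = 0 as the traceless condition.

        import numpy as np

        def tensor_invariants(T):
            """Trace, mode, and sorted eigenvalues of a 3x3 symmetric tensor.
            Mode = 3*sqrt(6)*det(D/||D||) with D the deviator: +1 linear, -1 planar, 0 neutral."""
            T = np.asarray(T, dtype=float)
            trace = np.trace(T)
            D = T - trace / 3.0 * np.eye(3)                 # deviatoric part
            norm = np.linalg.norm(D)
            mode = 0.0 if norm == 0 else 3.0 * np.sqrt(6.0) * np.linalg.det(D / norm)
            eigvals = np.sort(np.linalg.eigvalsh(T))[::-1]  # descending order
            return trace, mode, eigvals

        # Example: a tensor whose middle eigenvalue is the mean of the other two lies on
        # the neutral set (mode ~ 0); its positive trace places it on the positive side of
        # the traceless surface.
        T = np.diag([3.0, 2.0, 1.0])
        trace, mode, ev = tensor_invariants(T)
        print(f"trace={trace:.2f}  mode={mode:.2f}  eigenvalues={ev}")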

  13. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    As the proposition of the next-generation Web - semantic Web, semantic computing has been drawing more and more attention within the circle and the industries. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting pivot - language resources, for instance, language knowledge bases. This paper proposes three perspectives of semantic computing from a macro view and describes the current status of affairs about the construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources via a case study in the Institute of Computational Linguistics at Peking University.

  14. Advanced construction management for lunar base construction - Surface operations planner

    Science.gov (United States)

    Kehoe, Robert P.

    1992-01-01

    The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.

  15. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  16. Ocular Surface and Tear Film Changes in Older Women Working with Computers

    Directory of Open Access Journals (Sweden)

    Alfredo Ribelles

    2015-01-01

    Full Text Available The aim of this work is to investigate changes in the ocular surface (OS) and tear film (TF) by means of questionnaire-based subjective symptoms, TF break-up time, Schirmer test, and TF analysis in women working with computers, and to analyze the effects of oral supplementation with antioxidants/omega 3 fatty acids (A/ω3) on the OS outcomes. Women aged 40–65 years (n=148) were recruited at the Administrative Offices of Valencia (Spain) and distributed into two age groups, 40–52 years (AGE1; n=87) and 53–65 years (AGE2; n=61), and then subdivided according to whether or not they were computer users (CUG; NCUG) during the workday. Homogeneous subgroups were randomly assigned (or not) to the daily intake of three pills of A/ω3 for three months. At baseline and at the end of follow-up, personalized interviews and ocular examinations were done. Reflex tear samples were collected from the inferior meniscus and processed for a multiplexed particle-based flow cytometry assay to measure proinflammatory molecules. Statistics were performed using the SPSS 15.0 program. The OS pathology was clinically evident in the AGE1-CUG (33%) versus the AGE2-CUG (64%) of women. Significantly higher interleukin-1β and -6 tear levels were found in the AGE1 versus the AGE2 women employees (P=0.006 and P=0.001, resp.), as well as in the CUG versus the NCUG (P=0.001 and P=0.000, resp.). Supplementation with A/ω3 positively influenced the OS pathology, as manifested by the amelioration of the clinical signs/symptoms related to computer use. Strategies involving a safe environment and oral micronutrient supplements may be managed within eye-care standards in older women.

  17. Computer vision-based apple grading for golden delicious apples based on surface features

    Directory of Open Access Journals (Sweden)

    Payman Moallem

    2017-03-01

    Full Text Available In this paper, a computer vision-based algorithm for golden delicious apple grading is proposed which works in six steps. Non-apple pixels are first removed from the input images as background. Then, the stem end is detected by a combination of morphological methods and a Mahalanobis distance classifier. The calyx region is also detected by applying K-means clustering on the Cb component in the YCbCr color space. After that, defect segmentation is achieved using a Multi-Layer Perceptron (MLP) neural network. In the next step, the stem end and calyx regions are removed from the defected regions to refine and improve the apple grading process. Then, statistical, textural, and geometric features are extracted from the refined defected regions. Finally, for apple grading, the performance of Support Vector Machine (SVM), MLP, and K-Nearest Neighbor (KNN) classifiers is compared. Classification is done in two manners: in the first, an input apple is classified into two categories, healthy and defected; in the second, the input apple is classified into three categories, first rank, second rank, and rejected. In both grading steps, the SVM classifier works best, with recognition rates of 92.5% and 89.2% for the two categories (healthy and defected) and the three quality categories (first rank, second rank, and rejected), respectively, among 120 different golden delicious apple images, considering K-folding with K = 5. Moreover, the accuracy of the proposed segmentation algorithms, including stem end detection and calyx detection, is evaluated on two different apple image databases.
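
    A compressed sketch of two of the pipeline stages is given below for orientation. It is an illustrative approximation only: it assumes OpenCV and scikit-learn are available, reduces the feature vector to a few toy statistics, and leaves the training data (X_train, y_train) as assumed inputs rather than reproducing the 120-image data set of the study.

        import cv2
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC

        def calyx_mask(bgr_img, n_clusters=2):
            """Detect a candidate calyx region by K-means clustering of the Cb channel
            (OpenCV stores YCrCb as Y, Cr, Cb, so Cb is channel index 2)."""
            ycrcb = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2YCrCb)
            cb = ycrcb[:, :, 2].reshape(-1, 1).astype(np.float32)
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(cb)
            # Assume the lower-Cb cluster corresponds to the calyx (illustrative heuristic).
            calyx_label = np.argmin([cb[labels == k].mean() for k in range(n_clusters)])
            return (labels == calyx_label).reshape(bgr_img.shape[:2])

        def defect_features(gray_defect_region):
            """Toy statistics standing in for the statistical/textural/geometric features
            extracted from a refined defect region."""
            return [gray_defect_region.mean(), gray_defect_region.std(),
                    np.count_nonzero(gray_defect_region) / gray_defect_region.size]

        def train_grader(X_train, y_train):
            """Grading step: fit an SVM on labelled feature vectors (inputs assumed given)."""
            clf = SVC(kernel="rbf")
            clf.fit(X_train, y_train)
            return clf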

  18. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  19. Fast time-of-flight camera based surface registration for radiotherapy patient positioning.

    Science.gov (United States)

    Placht, Simon; Stancanello, Joseph; Schaller, Christian; Balda, Michael; Angelopoulou, Elli

    2012-01-01

    This work introduces a rigid registration framework for patient positioning in radiotherapy, based on real-time surface acquisition by a time-of-flight (ToF) camera. Dynamic properties of the system are also investigated for future gating/tracking strategies. A novel preregistration algorithm, based on translation and rotation-invariant features representing surface structures, was developed. Using these features, corresponding three-dimensional points were computed in order to determine initial registration parameters. These parameters became a robust input to an accelerated version of the iterative closest point (ICP) algorithm for the fine-tuning of the registration result. Distance calibration and Kalman filtering were used to compensate for ToF-camera dependent noise. Additionally, the advantage of using the feature based preregistration over an "ICP only" strategy was evaluated, as well as the robustness of the rigid-transformation-based method to deformation. The proposed surface registration method was validated using phantom data. A mean target registration error (TRE) for translations and rotations of 1.62 ± 1.08 mm and 0.07° ± 0.05°, respectively, was achieved. There was a temporal delay of about 65 ms in the registration output, which can be seen as negligible considering the dynamics of biological systems. Feature based preregistration allowed for accurate and robust registrations even at very large initial displacements. Deformations affected the accuracy of the results, necessitating particular care in cases of deformed surfaces. The proposed solution is able to solve surface registration problems with an accuracy suitable for radiotherapy cases where external surfaces offer primary or complementary information to patient positioning. The system shows promising dynamic properties for its use in gating/tracking applications. The overall system is competitive with commonly-used surface registration technologies. Its main benefit is the
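
    The two-stage structure of the method, a coarse alignment followed by ICP fine-tuning, can be illustrated with the minimal rigid-registration sketch below. It shows only a generic SVD-based rigid fit inside a basic point-to-point ICP loop (SciPy's KD-tree supplies the correspondences); the feature-based preregistration, distance calibration, and Kalman filtering of the actual system are not reproduced, and the synthetic point cloud at the end is purely illustrative.

        import numpy as np
        from scipy.spatial import cKDTree

        def best_fit_rigid(src, dst):
            """Least-squares rotation R and translation t mapping src -> dst (Kabsch/SVD)."""
            c_src, c_dst = src.mean(0), dst.mean(0)
            H = (src - c_src).T @ (dst - c_dst)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:            # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, c_dst - R @ c_src

        def icp(src, dst, iters=30):
            """Basic point-to-point ICP: src is the acquired surface, dst the reference."""
            R, t = np.eye(3), np.zeros(3)
            tree = cKDTree(dst)
            cur = src.copy()
            for _ in range(iters):
                _, idx = tree.query(cur)                     # nearest-neighbour correspondences
                R, t = best_fit_rigid(src, dst[idx])
                cur = src @ R.T + t
            return R, t

        # Illustrative check with a synthetic point cloud and a small known displacement.
        rng = np.random.default_rng(0)
        dst = rng.normal(size=(500, 3))                      # reference surface points
        angle = np.deg2rad(2.0)
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                           [np.sin(angle),  np.cos(angle), 0.0],
                           [0.0, 0.0, 1.0]])
        src = dst @ R_true.T + np.array([0.02, -0.01, 0.03])  # slightly displaced copy
        R, t = icp(src, dst)
        rms = np.sqrt(((src @ R.T + t - dst) ** 2).sum(axis=1).mean())
        print(f"post-registration RMS error: {rms:.2e}")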

  20. Fast time-of-flight camera based surface registration for radiotherapy patient positioning

    International Nuclear Information System (INIS)

    Placht, Simon; Stancanello, Joseph; Schaller, Christian; Balda, Michael; Angelopoulou, Elli

    2012-01-01

    Purpose: This work introduces a rigid registration framework for patient positioning in radiotherapy, based on real-time surface acquisition by a time-of-flight (ToF) camera. Dynamic properties of the system are also investigated for future gating/tracking strategies. Methods: A novel preregistration algorithm, based on translation and rotation-invariant features representing surface structures, was developed. Using these features, corresponding three-dimensional points were computed in order to determine initial registration parameters. These parameters became a robust input to an accelerated version of the iterative closest point (ICP) algorithm for the fine-tuning of the registration result. Distance calibration and Kalman filtering were used to compensate for ToF-camera dependent noise. Additionally, the advantage of using the feature based preregistration over an "ICP only" strategy was evaluated, as well as the robustness of the rigid-transformation-based method to deformation. Results: The proposed surface registration method was validated using phantom data. A mean target registration error (TRE) for translations and rotations of 1.62 ± 1.08 mm and 0.07° ± 0.05°, respectively, was achieved. There was a temporal delay of about 65 ms in the registration output, which can be seen as negligible considering the dynamics of biological systems. Feature based preregistration allowed for accurate and robust registrations even at very large initial displacements. Deformations affected the accuracy of the results, necessitating particular care in cases of deformed surfaces. Conclusions: The proposed solution is able to solve surface registration problems with an accuracy suitable for radiotherapy cases where external surfaces offer primary or complementary information to patient positioning. The system shows promising dynamic properties for its use in gating/tracking applications. The overall system is competitive with commonly-used surface registration

  1. Ammonia-based quantum computer

    International Nuclear Information System (INIS)

    Ferguson, Andrew J.; Cain, Paul A.; Williams, David A.; Briggs, G. Andrew D.

    2002-01-01

    We propose a scheme for quantum computation using two eigenstates of ammonia or similar molecules. Individual ammonia molecules are confined inside fullerenes and used as two-level qubit systems. Interaction between these ammonia qubits takes place via the electric dipole moments, and in particular we show how a controlled-NOT gate could be implemented. After computation the qubit is measured with a single-electron electrometer sensitive enough to differentiate between the dipole moments of different states. We also discuss a possible implementation based on a quantum cellular automaton

  2. Lithography-based additive manufacture of ceramic biodevices with design-controlled surface topographies

    OpenAIRE

    Blas Romero, Adrián de; Pfaffinger, Markus; Mitteramskogler, Gerald; Schwentenwein, Martin; Jellinek, Christopher; Homa, Johannes; Díaz Lantada, Andrés; Stampfl, Jürgen

    2017-01-01

    The possibility of manufacturing textured materials and devices, with surface properties controlled from the design stage, instead of being the result of machining processes or chemical attacks, is a key factor for the incorporation of advanced functionalities to a wide set of micro- and nanosystems. High-precision additive manufacturing (AM) technologies based on photopolymerization, together with the use of fractal models linked to computer-aided design tools, allow for a precise definit...

  3. Computer simulation of RBS spectra from samples with surface roughness

    Czech Academy of Sciences Publication Activity Database

    Malinský, Petr; Hnatowicz, Vladimír; Macková, Anna

    2016-01-01

    Roč. 371, MAR (2016), s. 101-105 ISSN 0168-583X. [22nd International conference on Ion Beam Analysis (IBA). Opatija, 14.06.2015-19.06.2015] R&D Projects: GA MŠk(CZ) LM2011019; GA ČR GA15-01602S Institutional support: RVO:61389005 Keywords : computer simulation * Rutherford backscattering * surface roughness Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.109, year: 2016

  4. Computer-based feedback in formative assessment

    NARCIS (Netherlands)

    van der Kleij, Fabienne

    2013-01-01

    Formative assessment concerns any assessment that provides feedback that is intended to support learning and can be used by teachers and/or students. Computers could offer a solution to overcoming obstacles encountered in implementing formative assessment. For example, computer-based assessments

  5. Surface inspection system for industrial components based on shape from shading minimization approach

    Science.gov (United States)

    Kotan, Muhammed; Öz, Cemil

    2017-12-01

    An inspection system is proposed that uses estimated three-dimensional (3-D) surface information to detect and classify faults, in order to increase the quality control of frequently used industrial components. Shape from shading (SFS) is one of the basic and classic 3-D shape recovery problems in computer vision. In our application, we developed a system using the Frankot and Chellappa SFS method, which is based on the minimization of a selected basis function. First, a specialized image acquisition system captures images of the component. To eliminate noise, a wavelet transform is applied to the captured images. Then, the estimated gradients are used to obtain depth and surface profiles. The depth information is used to determine and classify surface defects. A comparison with some linearization-based SFS algorithms is also discussed. The developed system was applied to real products, and the results indicate that SFS approaches are useful and that various types of defects can easily be detected in a short period of time.
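
    The depth-recovery step, Frankot and Chellappa's projection of the estimated gradient field onto integrable Fourier basis functions, can be written compactly. The sketch below is a generic textbook-style implementation for illustration (periodic boundary assumptions, no wavelet denoising), not the authors' code; the synthetic surface used for the round-trip check is hypothetical.

        import numpy as np

        def frankot_chellappa(p, q):
            """Integrate a gradient field (p = dz/dx, q = dz/dy, per-sample units) into a
            depth map z via the Frankot-Chellappa least-squares projection in Fourier space."""
            rows, cols = p.shape
            wy, wx = np.meshgrid(np.fft.fftfreq(rows) * 2 * np.pi,
                                 np.fft.fftfreq(cols) * 2 * np.pi, indexing="ij")
            P, Q = np.fft.fft2(p), np.fft.fft2(q)
            denom = wx**2 + wy**2
            denom[0, 0] = 1.0                      # avoid division by zero at the DC term
            Z = (-1j * wx * P - 1j * wy * Q) / denom
            Z[0, 0] = 0.0                          # depth is recovered only up to a constant
            return np.real(np.fft.ifft2(Z))

        # Illustrative round trip on a smooth synthetic surface.
        y, x = np.mgrid[0:128, 0:128] / 128.0 * 2 * np.pi
        z_true = np.sin(x) * np.cos(y)
        p = np.gradient(z_true, axis=1)            # dz/dx (columns)
        q = np.gradient(z_true, axis=0)            # dz/dy (rows)
        z_rec = frankot_chellappa(p, q)
        # Small residual reflects the finite-difference vs. spectral derivative mismatch.
        print(np.abs((z_rec - z_rec.mean()) - (z_true - z_true.mean())).max())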

  6. On the computation of molecular surface correlations for protein docking using fourier techniques.

    Science.gov (United States)

    Sakk, Eric

    2007-08-01

    The computation of surface correlations using a variety of molecular models has been applied to the unbound protein docking problem. Because of the computational complexity involved in examining all possible molecular orientations, the fast Fourier transform (FFT) (a fast numerical implementation of the discrete Fourier transform (DFT)) is generally applied to minimize the number of calculations. This approach is rooted in the convolution theorem which allows one to inverse transform the product of two DFTs in order to perform the correlation calculation. However, such a DFT calculation results in a cyclic or "circular" correlation which, in general, does not lead to the same result as the linear correlation desired for the docking problem. In this work, we provide computational bounds for constructing molecular models used in the molecular surface correlation problem. The derived bounds are then shown to be consistent with various intuitive guidelines previously reported in the protein docking literature. Finally, these bounds are applied to different molecular models in order to investigate their effect on the correlation calculation.
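
    The central point, that an unpadded DFT yields a circular rather than a linear correlation, is easy to demonstrate numerically. The short 1-D sketch below uses arbitrary test vectors rather than molecular grids: padding each sequence to at least N1 + N2 − 1 samples makes the FFT-based result coincide with the directly computed linear correlation, while the unpadded result wraps around.

        import numpy as np

        def fft_correlation(a, b, pad=True):
            """Cross-correlation via the convolution theorem: IFFT(FFT(a) * conj(FFT(b))).
            Without zero padding the result is circular; padding to len(a)+len(b)-1 samples
            makes it equal to the linear correlation (negative lags wrap to the end)."""
            n = len(a) + len(b) - 1 if pad else max(len(a), len(b))
            A, B = np.fft.fft(a, n), np.fft.fft(b, n)
            return np.real(np.fft.ifft(A * np.conj(B)))

        a = np.array([1.0, 2.0, 3.0, 4.0])
        b = np.array([1.0, 0.0, -1.0])

        circular = fft_correlation(a, b, pad=False)
        linear = np.roll(fft_correlation(a, b, pad=True), len(b) - 1)  # reorder lags -2..3
        direct = np.correlate(a, b, mode="full")                       # reference result

        print("circular (no padding):", np.round(circular, 3))
        print("FFT with zero padding:", np.round(linear, 3))
        print("direct linear        :", np.round(direct, 3))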

  7. An energy-based equilibrium contact angle boundary condition on jagged surfaces for phase-field methods.

    Science.gov (United States)

    Frank, Florian; Liu, Chen; Scanziani, Alessio; Alpak, Faruk O; Riviere, Beatrice

    2018-08-01

    We consider an energy-based boundary condition to impose an equilibrium wetting angle for the Cahn-Hilliard-Navier-Stokes phase-field model on voxel-set-type computational domains. These domains typically stem from μCT (micro computed tomography) imaging of porous rock and approximate a (on μm scale) smooth domain with a certain resolution. Planar surfaces that are perpendicular to the main axes are naturally approximated by a layer of voxels. However, planar surfaces in any other directions and curved surfaces yield a jagged/topologically rough surface approximation by voxels. For the standard Cahn-Hilliard formulation, where the contact angle between the diffuse interface and the domain boundary (fluid-solid interface/wall) is 90°, jagged surfaces have no impact on the contact angle. However, a prescribed contact angle smaller or larger than 90° on jagged voxel surfaces is amplified. As a remedy, we propose the introduction of surface energy correction factors for each fluid-solid voxel face that counterbalance the difference of the voxel-set surface area with the underlying smooth one. The discretization of the model equations is performed with the discontinuous Galerkin method. However, the presented semi-analytical approach of correcting the surface energy is equally applicable to other direct numerical methods such as finite elements, finite volumes, or finite differences, since the correction factors appear in the strong formulation of the model. Copyright © 2018 Elsevier Inc. All rights reserved.
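
    The size of the mismatch that the correction factors counterbalance can be estimated in a few lines. The sketch below is an illustrative estimate only, not the authors' discontinuous Galerkin implementation: it voxelizes a sphere, sums the areas of the exposed voxel faces, and compares the result with the analytic surface area (the staircase approximation of a sphere overestimates the area by a factor approaching 1.5).

        import numpy as np

        def voxel_surface_area(mask, h):
            """Total area of exposed faces of a voxel set (True = solid), voxel edge h."""
            exposed = 0
            padded = np.pad(mask, 1, constant_values=False)
            for axis in range(3):
                diff = np.diff(padded.astype(np.int8), axis=axis)
                exposed += np.count_nonzero(diff)       # every +/-1 step is one exposed face
            return exposed * h * h

        # Voxelize a sphere of radius R on a grid with spacing h.
        R, h, n = 1.0, 0.02, 128
        coords = (np.arange(n) - n / 2 + 0.5) * h
        x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
        sphere = x**2 + y**2 + z**2 <= R**2

        a_voxel = voxel_surface_area(sphere, h)
        a_smooth = 4.0 * np.pi * R**2
        print(f"voxel area / smooth area = {a_voxel / a_smooth:.3f}")   # close to 1.5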

  8. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences

    Science.gov (United States)

    Rudd, Michael E.

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4. PMID:25202253
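
    A one-dimensional caricature of the edge-integration idea is given below: weighted steps in log luminance are summed along a path whose first element is the common background and whose last element is the target. It is only a schematic illustration with hypothetical luminance values and edge weights, not a quantitative implementation of the cortical model.

        import numpy as np

        def edge_integrated_lightness(luminances, weights=None):
            """Sum (optionally weighted) steps in log luminance along a path whose first
            element is the common background region and whose last element is the target."""
            steps = np.diff(np.log(np.asarray(luminances, dtype=float)))
            if weights is None:
                weights = np.ones_like(steps)
            return float(np.sum(weights * steps))     # relative log lightness of the target

        # Hypothetical path: background (100 cd/m^2) -> surround (50) -> target (80).
        path = [100.0, 50.0, 80.0]
        print(edge_integrated_lightness(path))                      # unweighted integration
        print(edge_integrated_lightness(path, weights=[0.7, 1.0]))  # higher weight on target edge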

  9. A Cortical Edge-integration Model of Object-Based Lightness Computation that Explains Effects of Spatial Context and Individual Differences

    Directory of Open Access Journals (Sweden)

    Michael E Rudd

    2014-08-01

    Full Text Available Previous work demonstrated that perceived surface reflectance (lightness can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatial integrates these steps along paths through the image to compute lightness (Rudd & Zemach, 2004, 2005, 2007. This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013 suggests that the human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010 further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer’s interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd & Zemach, 2005. Here, I show how the separate influences of grouping and attention on lightness can be together modeled by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013, and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.

  10. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences.

    Science.gov (United States)

    Rudd, Michael E

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.

  11. Evaluating the toxicity of TiO2-based nanoparticles to Chinese hamster ovary cells and Escherichia coli: a complementary experimental and computational approach

    Directory of Open Access Journals (Sweden)

    Alicja Mikolajczyk

    2017-10-01

    Full Text Available Titania-supported palladium, gold, and bimetallic nanoparticles (second-generation nanoparticles) demonstrate promising photocatalytic properties. However, due to their unusual reactivity, second-generation nanoparticles can be hazardous to living organisms. Considering the ever-growing number of new types of nanoparticles that can potentially contaminate the environment, determining their toxicity is extremely important. The main aim of the presented study was to investigate the cytotoxic effect of surface-modified TiO2-based nanoparticles, to model their quantitative nanostructure–toxicity relationships, and to reveal the toxicity mechanism. In this context, toxicity tests for surface-modified TiO2-based nanoparticles were performed in vitro, using the Gram-negative bacterium Escherichia coli and Chinese hamster ovary (CHO-K1) cells. The obtained cytotoxicity data were analyzed by means of computational methods (the quantitative structure–activity relationships, QSAR, approach). Based on the combined experimental and computational approach, predictive models were developed, and relationships between cytotoxicity, size, and specific surface area (Brunauer–Emmett–Teller surface, BET) of the nanoparticles are discussed.
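
    The reported relationship between cytotoxicity, particle size, and BET surface area is the kind of model a simple regression-based nano-QSAR workflow produces. The sketch below is a generic scikit-learn illustration that assumes descriptor and toxicity arrays are supplied by the user; it is not the descriptor set, algorithm, or data of the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        def fit_nano_qsar(descriptors, toxicity):
            """Fit a linear nano-QSAR model, e.g. descriptors = [size_nm, bet_m2_per_g]
            per nanoparticle and toxicity = a measured endpoint such as log(1/EC50)."""
            X = np.asarray(descriptors, dtype=float)
            y = np.asarray(toxicity, dtype=float)
            model = LinearRegression().fit(X, y)
            # Leave-one-out cross-validation, common for the small data sets typical of nano-QSAR.
            loo_scores = cross_val_score(LinearRegression(), X, y,
                                         cv=LeaveOneOut(), scoring="neg_mean_squared_error")
            return model, -loo_scores.mean()

        # Usage (arrays are placeholders to be replaced with measured data):
        # model, loo_mse = fit_nano_qsar(descriptors=X_measured, toxicity=y_measured)
        # print(model.coef_, model.intercept_, loo_mse)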

  12. Computational aeroelasticity using a pressure-based solver

    Science.gov (United States)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well validated k-ε turbulence model with wall function treatment for near wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economic for unsteady flow computations. Wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved. This is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement of experiment and previous numerical results. The computational methodology exhibited capabilities to predict both qualitative and quantitative features of aeroelasticity.
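
    The structural side of the coupling, implicit time-marching of the beam equations of motion with the Newmark method, can be summarized in a short routine. The sketch below is a generic constant-average-acceleration Newmark integrator for M·a + C·v + K·u = F(t), written with small dense matrices for illustration; it is not the authors' solver, and the single-degree-of-freedom check at the end is a hypothetical test case.

        import numpy as np

        def newmark(M, C, K, F, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
            """Implicit Newmark-beta integration of M*a + C*v + K*u = F(t).
            F is a callable returning the load vector at time t; returns the displacement history."""
            u, v = np.array(u0, float), np.array(v0, float)
            a = np.linalg.solve(M, F(0.0) - C @ v - K @ u)          # initial acceleration
            K_eff = K + gamma / (beta * dt) * C + 1.0 / (beta * dt**2) * M
            history = [u.copy()]
            for i in range(1, n_steps + 1):
                t = i * dt
                rhs = (F(t)
                       + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
                       + C @ (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                              + dt * (gamma / (2.0 * beta) - 1.0) * a))
                u_new = np.linalg.solve(K_eff, rhs)
                a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
                v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
                u, v, a = u_new, v_new, a_new
                history.append(u.copy())
            return np.array(history)

        # Single-degree-of-freedom check: undamped oscillator with a 1 s period, released from rest.
        M, C, K = np.eye(1), np.zeros((1, 1)), np.array([[4.0 * np.pi**2]])
        hist = newmark(M, C, K, lambda t: np.zeros(1), u0=[1.0], v0=[0.0], dt=0.01, n_steps=100)
        print(hist[0, 0], hist[-1, 0])   # after one period the displacement returns close to 1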

  13. Simulated BRDF based on measured surface topography of metal

    Science.gov (United States)

    Yang, Haiyue; Haist, Tobias; Gronle, Marc; Osten, Wolfgang

    2017-06-01

    The radiative reflective properties of a calibration-standard rough surface were simulated by ray tracing and the finite-difference time-domain (FDTD) method. The simulation results were used to compute the bidirectional reflectance distribution function (BRDF) of metal surfaces and were compared with experimental measurements. The experimental and simulated results are in good agreement.

  14. Surface Acoustic Wave Tag-Based Coherence Multiplexing

    Science.gov (United States)

    Youngquist, Robert C. (Inventor); Malocha, Donald (Inventor); Saldanha, Nancy (Inventor)

    2016-01-01

    A surface acoustic wave (SAW)-based coherence multiplexing system includes SAW tags each including a SAW transducer, a first SAW reflector positioned a first distance from the SAW transducer and a second SAW reflector positioned a second distance from the SAW transducer. A transceiver including a wireless transmitter has a signal source providing a source signal and circuitry for transmitting interrogation pulses including a first and a second interrogation pulse toward the SAW tags, and a wireless receiver for receiving and processing response signals from the SAW tags. The receiver receives scrambled signals including a convolution of the wideband interrogation pulses with response signals from the SAW tags and includes a computing device which implements an algorithm that correlates the interrogation pulses or the source signal before transmitting against the scrambled signals to generate tag responses for each of the SAW tags.
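
    The receiver's correlation step can be illustrated with a toy baseband example: the known interrogation pulse is correlated against the scrambled return, and the delays of the resulting peaks recover the reflector spacings of a tag. The sketch below uses synthetic signals, arbitrary sample delays, and a simple chirp pulse purely for illustration; it does not model the actual RF hardware, tag coding, or the patented processing chain.

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(1)

        # Known interrogation pulse (a short chirp makes the correlation peak sharp).
        n_pulse = 128
        t = np.arange(n_pulse)
        pulse = np.sin(2 * np.pi * (0.01 * t + 0.001 * t**2))

        # Scrambled return: the pulse echoed by two reflectors of one tag at
        # hypothetical delays, plus noise.
        tag_delays = [300, 450]           # samples; set by the reflector spacings
        received = np.zeros(2048)
        for d, amp in zip(tag_delays, [1.0, 0.8]):
            received[d:d + n_pulse] += amp * pulse
        received += 0.05 * rng.normal(size=received.size)

        # Correlate the known pulse against the scrambled signal and locate the peaks.
        corr = np.correlate(received, pulse, mode="valid")
        peaks, _ = find_peaks(corr, height=0.5 * corr.max(), distance=n_pulse)
        print(peaks)                      # expected at (or very near) the programmed delays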

  15. Computer Based Expert Systems.

    Science.gov (United States)

    Parry, James D.; Ferrara, Joseph M.

    1985-01-01

    Claims knowledge-based expert computer systems can meet needs of rural schools for affordable expert advice and support and will play an important role in the future of rural education. Describes potential applications in prediction, interpretation, diagnosis, remediation, planning, monitoring, and instruction. (NEC)

  16. Variabilities of Magnetic Resonance Imaging-, Computed Tomography-, and Positron Emission Tomography-Computed Tomography-Based Tumor and Lymph Node Delineations for Lung Cancer Radiation Therapy Planning.

    Science.gov (United States)

    Karki, Kishor; Saraiya, Siddharth; Hugo, Geoffrey D; Mukhopadhyay, Nitai; Jan, Nuzhat; Schuster, Jessica; Schutzer, Matthew; Fahrner, Lester; Groves, Robert; Olsen, Kathryn M; Ford, John C; Weiss, Elisabeth

    2017-09-01

    To investigate interobserver delineation variability for gross tumor volumes of primary lung tumors and associated pathologic lymph nodes using magnetic resonance imaging (MRI), and to compare the results with computed tomography (CT) alone- and positron emission tomography (PET)-CT-based delineations. Seven physicians delineated the tumor volumes of 10 patients for the following scenarios: (1) CT only, (2) PET-CT fusion images registered to CT ("clinical standard"), and (3) postcontrast T1-weighted MRI registered with diffusion-weighted MRI. To compute interobserver variability, the median surface was generated from all observers' contours and used as the reference surface. A physician labeled the interface types (tumor to lung, atelectasis (collapsed lung), hilum, mediastinum, or chest wall) on the median surface. Contoured volumes and bidirectional local distances between individual observers' contours and the reference contour were analyzed. Computed tomography- and MRI-based tumor volumes normalized relative to PET-CT-based volumes were 1.62 ± 0.76 (mean ± standard deviation) and 1.38 ± 0.44, respectively. Volume differences between the imaging modalities were not significant. Between observers, the mean normalized volumes per patient averaged over all patients varied significantly by a factor of 1.6 (MRI) and 2.0 (CT and PET-CT) (P=4.10 × 10⁻⁵ to 3.82 × 10⁻⁹). The tumor-atelectasis interface had a significantly higher variability than other interfaces for all modalities combined (P=.0006). The interfaces with the smallest uncertainties were tumor-lung (on CT) and tumor-mediastinum (on PET-CT and MRI). Although MRI-based contouring showed overall larger variability than PET-CT, contouring variability depended on the interface type and was not significantly different between modalities, despite the limited observer experience with MRI. Multimodality imaging and combining different imaging characteristics might be the best approach to define

  17. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real needs. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage, and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  18. Research on the design of surface acquisition system of active lap based on FPGA and FX2LP

    Science.gov (United States)

    Zhao, Hongshen; Li, Xiaojin; Fan, Bin; Zeng, Zhige

    2014-08-01

    In order to study the dynamic surface-shape changes of an active lap during processing, this paper introduces a dynamic surface-shape acquisition system for active laps based on an FPGA and USB communication. The system consists of a high-precision micro-displacement sensor array, an acquisition circuit board, and a PC host computer; the acquisition circuit board comprises six FPGA-based sub-boards and a hub-board based on an FPGA and USB communication. Each sub-board is responsible for data acquisition from a number of independent sensor channels. The hub-board is responsible for emulating an encoder that supplies the active-lap deformation control system with position information, for sending synchronization signals that latch the sensor data on all sub-boards at the same instant, and for addressing the sub-boards to gather their sensor data one by one and transmit all sensor data, together with the position information, to the host computer via the USB chip FX2LP. Experimental results show that the system is capable of determining the position and speed of the active lap, and that both the control of surface transformation and the acquisition of dynamic surface data at a given position during processing are implemented.

  19. Hybrid computational phantoms of the male and female newborn patient: NURBS-based whole-body models

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lodwick, Daniel; Hasenauer, Deanna; Williams, Jonathan L; Lee, Choonik; Bolch, Wesley E

    2007-01-01

    Anthropomorphic computational phantoms are computer models of the human body for use in the evaluation of dose distributions resulting from either internal or external radiation sources. Currently, two classes of computational phantoms have been developed and widely utilized for organ dose assessment: (1) stylized phantoms and (2) voxel phantoms which describe the human anatomy via mathematical surface equations or 3D voxel matrices, respectively. Although stylized phantoms based on mathematical equations can be very flexible in regard to making changes in organ position and geometrical shape, they are limited in their ability to fully capture the anatomic complexities of human internal anatomy. In turn, voxel phantoms have been developed through image-based segmentation and correspondingly provide much better anatomical realism in comparison to simpler stylized phantoms. However, they themselves are limited in defining organs presented in low contrast within either magnetic resonance or computed tomography images-the two major sources in voxel phantom construction. By definition, voxel phantoms are typically constructed via segmentation of transaxial images, and thus while fine anatomic features are seen in this viewing plane, slice-to-slice discontinuities become apparent in viewing the anatomy of voxel phantoms in the sagittal or coronal planes. This study introduces the concept of a hybrid computational newborn phantom that takes full advantage of the best features of both its stylized and voxel counterparts: flexibility in phantom alterations and anatomic realism. Non-uniform rational B-spline (NURBS) surfaces, a mathematical modeling tool traditionally applied to graphical animation studies, was adopted to replace the limited mathematical surface equations of stylized phantoms. A previously developed whole-body voxel phantom of the newborn female was utilized as a realistic anatomical framework for hybrid phantom construction. The construction of a hybrid

  20. Nonlaser-based 3D surface imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shin-yee; Johnson, R.K.; Sherwood, R.J. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    3D surface imaging refers to methods that generate a 3D surface representation of objects of a scene under viewing. Laser-based 3D surface imaging systems are commonly used in manufacturing, robotics and biomedical research. Although laser-based systems provide satisfactory solutions for most applications, there are situations where non laser-based approaches are preferred. The issues that make alternative methods sometimes more attractive are: (1) real-time data capturing, (2) eye-safety, (3) portability, and (4) work distance. The focus of this presentation is on generating a 3D surface from multiple 2D projected images using CCD cameras, without a laser light source. Two methods are presented: stereo vision and depth-from-focus. Their applications are described.
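
    For the stereo-vision branch, the geometric relation that turns a two-dimensional pixel disparity into a 3D surface point is the familiar triangulation formula Z = f·B/d. The snippet below is a minimal illustration with hypothetical camera parameters, not the system described in the record.

        import numpy as np

        def disparity_to_depth(disparity_px, focal_px, baseline_m):
            """Depth Z = f*B/d for a rectified stereo pair: disparity in pixels,
            focal length in pixels, baseline in metres."""
            disparity_px = np.asarray(disparity_px, dtype=float)
            return focal_px * baseline_m / disparity_px

        def backproject(u, v, depth_m, focal_px, cx, cy):
            """3-D point (X, Y, Z) of pixel (u, v) at depth Z for a pinhole camera."""
            X = (u - cx) * depth_m / focal_px
            Y = (v - cy) * depth_m / focal_px
            return np.array([X, Y, depth_m])

        # Hypothetical rig: 1200 px focal length, 10 cm baseline, principal point (640, 480).
        Z = disparity_to_depth(disparity_px=24.0, focal_px=1200.0, baseline_m=0.10)
        print(Z)                                        # 5.0 m to the surface point
        print(backproject(700, 500, Z, 1200.0, 640, 480))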

  1. Computational Fluid Dynamics Modeling of Steam Condensation on Nuclear Containment Wall Surfaces Based on Semiempirical Generalized Correlations

    Directory of Open Access Journals (Sweden)

    Pavan K. Sharma

    2012-01-01

    Full Text Available In water-cooled nuclear power reactors, significant quantities of steam and hydrogen could be produced within the primary containment following postulated design basis accidents (DBA) or beyond design basis accidents (BDBA). For accurate calculation of the temperature/pressure rise and of hydrogen transport in the nuclear reactor containment under such scenarios, a wall condensation heat transfer coefficient (HTC) is used. In the present work, the adaptation of a commercial CFD code, with the implementation of models for steam condensation on wall surfaces in the presence of noncondensable gases, is explained. Steam condensation has been modeled using the empirical average HTC, which was originally developed for “lumped-parameter” (volume-averaged) modeling of steam condensation in the presence of noncondensable gases. The present paper suggests a generalized HTC based on curve fitting of most of the reported semiempirical condensation models, which are valid for specific wall conditions. The present methodology has been validated against limited reported experimental data from the COPAIN experimental facility. This is the first step towards a CFD-based generalized analysis procedure for condensation modeling applicable to containment wall surfaces, which is being evolved further for specific wall surfaces within the multicompartment containment atmosphere.

  2. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  3. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. Hereby, we conducted an in-depth analysis of computer-enabled music learning and the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  4. Computer-based literature search in medical institutions in India

    Directory of Open Access Journals (Sweden)

    Kalita Jayantee

    2007-01-01

    Full Text Available Aim: To study the use of computer-based literature search and its application in clinical training and patient care as a surrogate marker of evidence-based medicine. Materials and Methods: A questionnaire comprising questions on purpose (presentation, patient management, research), realm (site accessed), nature and frequency of search, effect, infrastructure, formal training in computer-based literature search, and suggestions for further improvement was sent to residents and faculty of a Postgraduate Medical Institute (PGI) and a Medical College. The responses were compared amongst different subgroups of respondents. Results: Out of 300 subjects approached, 194 responded, of whom 103 were from the PGI and 91 from the Medical College. There were 97 specialty residents, 58 super-specialty residents and 39 faculty members. Computer-based literature search was done at least once a month by 89%, though there was marked variability in frequency and extent. The motivation for computer-based literature search was presentation in 90%, research in 65% and patient management in 60.3%. The benefit of the search was acknowledged in learning and teaching by 80%, research by 65% and patient care by 64.4% of respondents. Formal training in computer-based literature search had been received by 41%, of whom 80% were residents. Residents from the PGI did more frequent and more extensive computer-based literature searches, which was attributed to better infrastructure and training. Conclusion: Training and infrastructure are both crucial for computer-based literature search, which may translate into evidence-based medicine.

  5. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    Science.gov (United States)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely-packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emissions estimates from signals with a variety of nearfield influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to hone in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.

  6. Minimization of gully erosion on reclaimed surface mines using the stable slope and sediment transport computer model

    International Nuclear Information System (INIS)

    McKenney, R.A.; Gardner, T.G.

    1992-01-01

    Disequilibrium between slope form and hydrologic and erosion processes on reclaimed surface coal mines in the humid temperate northeastern US can result in gully erosion and sediment loads which are elevated above natural, background values. Initial sheetwash erosion is surpassed by gully erosion on reclamation sites which are not in equilibrium with post-mining hydrology. Long-term stability can be attained by designing a channel profile which is in equilibrium with the increased peak discharges found on reclaimed surface mines. The Stable Slope and Sediment Transport model (SSAST) was developed to design stable longitudinal channel profiles for post-mining hydrologic and erosional processes. SSAST is an event-based computer model that calculates the stable slope for a channel segment based on the post-mine hydrology and median grain size of a reclaimed surface mine. Peak discharge, which drives post-mine erosion, is calculated from a 10-year, 24-hour storm using the Soil Conservation Service curve number method; curve numbers calibrated for Pennsylvania surface mines are used. Reclamation sites are represented by the rectangle or triangle which most closely fits the shape of the site while having the same drainage area and length. Sediment transport and slope stability are calculated using a modified Bagnold equation with a correction factor for the irregular particle shapes formed during the mining process. Data from three reclaimed Pennsylvania surface mines were used to calibrate and verify SSAST. Analysis indicates that SSAST can predict longitudinal channel profiles for stable reclamation of surface mines in the humid, temperate northeastern US.
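    A small sketch of the runoff input described above, assuming only the standard SCS curve-number relation; the storm depth and curve number are hypothetical, and the routing to peak discharge performed by SSAST is not shown.

      # SCS curve-number runoff depth (US customary units); the CN value is hypothetical.
      def scs_runoff_in(precip_in, curve_number):
          """Runoff Q = (P - 0.2*S)**2 / (P + 0.8*S), with S = 1000/CN - 10 (inches)."""
          s = 1000.0 / curve_number - 10.0
          initial_abstraction = 0.2 * s
          if precip_in <= initial_abstraction:
              return 0.0
          return (precip_in - initial_abstraction) ** 2 / (precip_in + 0.8 * s)

      # Example: a 4.5 in, 24-hour design storm on a reclaimed surface with an assumed CN of 85.
      print(round(scs_runoff_in(4.5, 85), 2))   # runoff depth in inches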

  7. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions the above-mentioned systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchic systems using control computers with reserve (backup) machines already becomes clear from consideration of the control systems applied in the Canadian nuclear power plants, which were among the first to be equipped with process computers. The control system now under development for the large Soviet reactors of the WWER type will also be based on the use of control computers. That part of the system concerned with controlling the reactor assembly is described in detail.

  8. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, different phosphazene bases, etc – were computed with CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Direct computations and computations with reference pKa values were used. The latter were of two types: (1) reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base or (2) reliable experimental pKa value in acetonitrile of the investigated base itself. ...

  9. Speckle noise reduction for computer generated holograms of objects with diffuse surfaces

    Science.gov (United States)

    Symeonidou, Athanasia; Blinder, David; Ahar, Ayyoub; Schretter, Colas; Munteanu, Adrian; Schelkens, Peter

    2016-04-01

    Digital holography is mainly used today for metrology and microscopic imaging and is emerging as an important potential technology for future holographic television. To generate the holographic content, computer-generated holography (CGH) techniques convert geometric descriptions of the 3D scene content. To model different surface types, an accurate model of light propagation has to be considered, including, for example, specular and diffuse reflection. In previous work, we proposed a fast CGH method for point cloud data using multiple wavefront recording planes, look-up tables (LUTs) and occlusion processing. This work extends our method to account for diffuse reflections, enabling rendering of deep 3D scenes in high resolution with wide viewing angle support. This is achieved by modifying the spectral response of the light propagation kernels contained in the look-up tables. However, holograms encoding diffuse reflective surfaces exhibit significant amounts of speckle noise, a problem inherent to holography. Hence, techniques to reduce speckle noise are evaluated in this paper. Moreover, we also propose a technique to suppress aperture diffraction during numerical, view-dependent rendering by apodizing the hologram. Results are compared visually and in terms of their respective computational efficiency. The experiments show that by modelling diffuse reflection in the LUTs, a more realistic yet computationally efficient framework for generating high-resolution CGH is achieved.

  10. Agent-Based Computational Modeling of Cell Culture ...

    Science.gov (United States)

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software CompuCell3D (CC3D) to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assumed a “fried egg” shape but became increasingly cuboidal with increasing confluency. The surface area presented by each cell to the overlying medium varies from cell to cell and is a determinant of the diffusional flux of toxicant from the medium into the cell. Thus, dose varies among cells for a given concentration of toxicant in the medium. Computer code describing diffusion of H2O2 from medium into each cell and clearance of H2O2 was calibrated against H2O2 time-course data (25, 50, or 75 uM H2O2 for 60 min) obtained with the Amplex Red assay for the medium and the H2O2-sensitive fluorescent reporter, HyPer, for cytosol. Cellular H2O2 concentrations peaked at about 5 min and were near baseline by 10 min. The model predicted a skewed distribution of surface areas, with between-cell variation usually 2-fold or less. Predicted variability in cellular dose was in rough agreement with the variation in the HyPer data. These results are preliminary, as the model was not calibrated to the morphology of a specific cell type. Future work will involve morphology model calibration against human bronchial epithelial (BEAS-2B) cells. Our results show, however, the potential of agent-based modeling
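    A hedged two-compartment sketch of the medium-to-cell H2O2 balance described above; it is not the CC3D model, and the uptake and clearance rate constants are illustrative rather than calibrated values.

      # Hypothetical two-compartment sketch: H2O2 enters a cell from the medium and is cleared.
      def simulate_h2o2(c_medium0=50.0, k_in=0.8, k_clear=2.5, minutes=60, dt=0.01):
          c_med, c_cell = c_medium0, 0.0   # concentrations in uM
          t, trace = 0.0, []
          while t <= minutes:
              flux_in = k_in * (c_med - c_cell)             # uptake driven by the gradient
              c_cell += (flux_in - k_clear * c_cell) * dt   # intracellular clearance
              c_med  += -0.01 * flux_in * dt                # medium is a much larger volume
              trace.append((t, c_cell))
              t += dt
          return trace

      peak_t, peak_c = max(simulate_h2o2(), key=lambda p: p[1])
      print(f"cytosolic peak ~{peak_c:.1f} uM at ~{peak_t:.1f} min")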

  11. MO-FG-CAMPUS-TeP1-03: Pre-Treatment Surface Imaging Based Collision Detection

    Energy Technology Data Exchange (ETDEWEB)

    Wiant, D; Maurer, J; Liu, H; Hayes, T; Shang, Q; Sintay, B [Cone Health Cancer Center, Greensboro, NC (United States)

    2016-06-15

    Purpose: Modern radiotherapy increasingly employs large immobilization devices, gantry attachments, and couch rotations for treatments, all of which raise the risk of collisions between the patient and the gantry or couch. Collision detection is often achieved by manually checking each couch position in the treatment room and sometimes results in extraneous imaging if collisions are detected after image-based setup has begun. In the interest of improving efficiency and avoiding extra imaging, we explore the use of a surface imaging based collision detection model. Methods: Surfaces acquired from AlignRT (VisionRT, London, UK) were transferred in wavefront format to a custom Matlab (Mathworks, Natick, MA) software package (CCHECK). Computed tomography (CT) scans acquired at the same time were sent to CCHECK in DICOM format. In CCHECK, binary maps of the surfaces were created and overlaid on the CT images based on the fixed relationship of the AlignRT and CT coordinate systems. Isocenters were added through a graphical user interface (GUI). CCHECK then compares the inputted surfaces to a model of the linear accelerator (linac) to check for collisions at defined gantry and couch positions. Note, CCHECK may be used with or without a CT. Results: The nominal surface image field of view is 650 mm × 900 mm, with variance based on patient position and size. The accuracy of collision detection is primarily based on the linac model and the surface mapping process. The current linac model and mapping process yield detection accuracies on the order of 5 mm, assuming no change in patient posture between surface acquisition and treatment. Conclusions: CCHECK provides a non-ionizing method to check for collisions without the patient in the treatment room. Collision detection accuracy may be improved with more robust linac modeling. Additional gantry attachments (e.g., conical collimators) can be easily added to the model.
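    A hedged geometric sketch of the kind of test such a tool performs (not the CCHECK linac model or its API): treat everything farther than a clearance radius from the gantry rotation axis as swept by the gantry, and flag surface points that intrude into that region. The clearance radius and example points are illustrative.

      import numpy as np

      # Conservative clearance-cylinder check; geometry values are hypothetical.
      def collides(surface_pts_mm, isocenter_mm, clearance_mm=380.0, axis=(0.0, 1.0, 0.0)):
          axis = np.asarray(axis, dtype=float)
          axis /= np.linalg.norm(axis)                       # gantry rotation axis (couch direction)
          rel = np.asarray(surface_pts_mm, dtype=float) - np.asarray(isocenter_mm, dtype=float)
          radial = rel - np.outer(rel @ axis, axis)          # component perpendicular to the axis
          return bool(np.any(np.linalg.norm(radial, axis=1) > clearance_mm))

      # Example with three hypothetical surface points (mm) and the isocenter at the origin.
      pts = [[0, 100, 150], [0, -200, 360], [50, 0, 400]]
      print(collides(pts, [0, 0, 0]))   # True: the last point lies outside the clearance cylinder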

  12. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    Science.gov (United States)

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than traditional "encrypt-then-sign" or "sign-then-encrypt" strategy. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and is gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required for the end users in signcryption and designcryption is linear with the complexity of signing and encryption access policy. Moreover, only a single authority that is responsible for attribute management and key generation exists in the previous proposed ABSC schemes, whereas in reality, mostly, different authorities monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.

  13. Computer-based learning for the enhancement of breastfeeding ...

    African Journals Online (AJOL)

    In this study, computer-based learning (CBL) was explored in the context of breastfeeding training for undergraduate Dietetic students. Aim: To adapt and validate an Indian computer-based undergraduate breastfeeding training module for use by South African undergraduate Dietetic students. Methods and materials: The ...

  14. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
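    A minimal sketch of one of the reviewed techniques, a quadratic response-surface metamodel fit by least squares; the "expensive analysis code" here is a stand-in function and the design points are random samples rather than a formal experimental design.

      import numpy as np

      # Fit y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2 to sampled analysis results.
      rng = np.random.default_rng(0)
      X = rng.uniform(-1.0, 1.0, size=(30, 2))                      # design points
      y = 3.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]   # stand-in for the expensive code

      A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                           X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      def surrogate(x1, x2):
          return coef @ [1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2]

      print(round(surrogate(0.2, -0.4), 3))   # cheap prediction instead of a full analysis run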

  15. Metadyn View: Fast web-based viewer of free energy surfaces calculated by metadynamics

    Science.gov (United States)

    Hošek, Petr; Spiwok, Vojtěch

    2016-01-01

    Metadynamics is a highly successful enhanced sampling technique for simulation of molecular processes and prediction of their free energy surfaces. An in-depth analysis of data obtained by this method is as important as the simulation itself. Although there are several tools to compute free energy surfaces from metadynamics data, they usually lack user friendliness and a built-in visualization part. Here we introduce Metadyn View as a fast and user-friendly viewer of bias potential/free energy surfaces calculated by metadynamics with the Plumed package. It is based on modern web technologies including HTML5, JavaScript and Cascading Style Sheets (CSS). It can be used by visiting the web site and uploading a HILLS file. It calculates the bias potential/free energy surface on the client side, so it can run online or offline without the necessity of installing additional web engines. Moreover, it includes tools for measurement of free energies and free energy differences and for data/image export.
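    A hedged sketch of the underlying computation such a viewer performs for a single collective variable: summing the deposited Gaussian hills to reconstruct the bias potential. The assumed HILLS column layout (time, CV, sigma, height, ...) is illustrative; check the header of the actual file.

      import numpy as np

      def bias_from_hills(hills_path, grid):
          data = np.loadtxt(hills_path, comments="#")        # skips the "#! FIELDS ..." header
          centers, sigmas, heights = data[:, 1], data[:, 2], data[:, 3]
          bias = np.zeros_like(grid)
          for s0, sig, w in zip(centers, sigmas, heights):
              bias += w * np.exp(-(grid - s0) ** 2 / (2.0 * sig ** 2))
          return bias    # the free energy estimate is then F(s) ~ -bias (up to a constant)

      # grid = np.linspace(-3.14, 3.14, 400); free_energy = -bias_from_hills("HILLS", grid)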

  16. Computation of 3D steady Navier-Stokes flow with free-surface gravity waves

    NARCIS (Netherlands)

    Lewis, M.R.; Koren, B.; Raven, H.C.; Armfield, S.; Morgan, P.; Srinivas, K.

    2003-01-01

    In this paper an iterative method for the computation of stationary gravity-wave solutions is investigated, using a novel formulation of the free-surface (FS) boundary-value problem. This method requires the solution of a sequence of stationary Reynolds-Averaged Navier-Stokes subproblems employing

  17. Computation of 3D steady Navier-Stokes flow with free-surface gravity waves

    NARCIS (Netherlands)

    M.R. Lewis; B. Koren (Barry); H.C. Raven

    2003-01-01

    In this paper an iterative method for the computation of stationary gravity-wave solutions is investigated, using a novel formulation of the free-surface (FS) boundary-value problem. This method requires the solution of a sequence of stationary Reynolds-Averaged Navier-Stokes subproblems

  18. Computational mesh generation for vascular structures with deformable surfaces

    International Nuclear Information System (INIS)

    Putter, S. de; Laffargue, F.; Breeuwer, M.; Vosse, F.N. van de; Gerritsen, F.A.; Philips Medical Systems, Best

    2006-01-01

    Computational blood flow and vessel wall mechanics simulations for vascular structures are becoming an important research tool for patient-specific surgical planning and intervention. An important step in the modelling process for patient-specific simulations is the creation of the computational mesh based on the segmented geometry. Most known solutions either require a large amount of manual processing or lead to a substantial difference between the segmented object and the actual computational domain. We have developed a chain of algorithms that lead to a closely related implementation of image segmentation with deformable models and 3D mesh generation. The resulting processing chain is very robust and leads both to an accurate geometrical representation of the vascular structure as well as high quality computational meshes. The chain of algorithms has been tested on a wide variety of shapes. A benchmark comparison of our mesh generation application with five other available meshing applications clearly indicates that the new approach outperforms the existing methods in the majority of cases. (orig.)

  19. Layered architecture for quantum computing

    OpenAIRE

    Jones, N. Cody; Van Meter, Rodney; Fowler, Austin G.; McMahon, Peter L.; Kim, Jungsang; Ladd, Thaddeus D.; Yamamoto, Yoshihisa

    2010-01-01

    We develop a layered quantum-computer architecture, which is a systematic framework for tackling the individual challenges of developing a quantum computer while constructing a cohesive device design. We discuss many of the prominent techniques for implementing circuit-model quantum computing and introduce several new methods, with an emphasis on employing surface-code quantum error correction. In doing so, we propose a new quantum-computer architecture based on optical control of quantum dot...

  20. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

    We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases' - one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical, or quantum in the underlying spin-lattice model.

  1. An efficient approach for computing the geometrical optics field reflected from a numerically specified surface

    Science.gov (United States)

    Mittra, R.; Rushdi, A.

    1979-01-01

    An approach for computing the geometrical optics field reflected from a numerically specified surface is presented. The approach includes the step of deriving a specular point and begins with computing the rays reflected off the surface at the points where their coordinates, as well as the partial derivatives (or, equivalently, the direction of the normal), are numerically specified. Then, a cluster of three adjacent rays is chosen to define a 'mean ray' and the divergence factor associated with this mean ray. Finally, the amplitude, phase, and vector direction of the reflected field at a given observation point are derived by associating this point with the nearest mean ray and determining its position relative to that ray.
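    A minimal sketch of the first step described above: reflecting an incident ray direction about the locally specified unit normal via r = d - 2(d·n)n. The mean-ray and divergence-factor steps are not shown.

      import numpy as np

      def reflect(d, n):
          """Reflect ray direction d about the unit surface normal n."""
          d = np.asarray(d, dtype=float)
          n = np.asarray(n, dtype=float)
          n = n / np.linalg.norm(n)
          return d - 2.0 * np.dot(d, n) * n

      # Example: a ray travelling in -z hits a patch whose normal is tilted 10 degrees toward x.
      n = [np.sin(np.radians(10)), 0.0, np.cos(np.radians(10))]
      print(reflect([0.0, 0.0, -1.0], n))   # reflected direction, rotated by twice the tilt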

  2. APC: A New Code for Atmospheric Polarization Computations

    Science.gov (United States)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  3. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  4. Effect of denture cleaning on abrasion resistance and surface topography of polymerized CAD CAM acrylic resin denture base.

    Science.gov (United States)

    Shinawi, Lana Ahmed

    2017-05-01

    The application of computer-aided design/computer-aided manufacturing (CAD CAM) technology in the fabrication of complete dentures offers numerous advantages, as it provides optimum fit and eliminates polymerization shrinkage of the acrylic base. Additionally, the porosity and surface roughness of CAD CAM resins are lower compared to conventionally processed resins, which leads to a decrease in the adhesion of bacteria on the denture base, which is associated with many conditions including halitosis and aspiration pneumonia in elderly denture wearers. To evaluate the influence of tooth brushing with dentifrices on CAD CAM resin blocks in terms of abrasion resistance, surface roughness and scanning electron photomicrography. This experimental study was carried out at the Faculty of Dentistry of King Abdulaziz University during 2016. A total of 40 rectangular polymerized CAD CAM resin samples were subjected to 40,000 and 60,000 brushing strokes under a 200-gram vertical load, simulating three years of tooth brushing, using a commercially available denture-cleaning dentifrice. Data were analyzed with SPSS version 20, using descriptive statistics and ANOVA. The ANOVA test revealed a statistically significant weight loss of the CAD CAM acrylic resin denture base specimens following 40,000 and 60,000 brushing strokes, as well as a statistically significant change (p=0.05) in surface roughness following brushing. Baseline SEM imaging of the CAD CAM resin samples revealed a relatively smooth, homogeneous surface, but following 40,000 and 60,000 brushing strokes, imaging showed small scratches on the surface. CAD CAM resin initially displayed a homogeneous surface with low surface roughness that was significantly affected after simulating three years of manual brushing; despite the significant weight loss, however, the findings are within clinically acceptable limits.

  5. Computational Modeling of Bloch Surface Waves in One-Dimensional Periodic and Aperiodic Multilayer Structures

    Science.gov (United States)

    Koju, Vijay

    Photonic crystals and their use in exciting Bloch surface waves have received immense attention over the past few decades. This interest is mainly due to their applications in bio-sensing, wave-guiding, and other optical phenomena such as surface field enhanced Raman spectroscopy. Improvements in numerical modeling techniques, state-of-the-art computing resources, and advances in fabrication techniques have also assisted the growing interest in this field. The ability to model photonic crystals computationally has benefited both the theoretical and experimental communities. It helps theoretical physicists solve complex problems which cannot be solved analytically and acquire useful insights that cannot be obtained otherwise. Experimentalists, on the other hand, can test different variants of their devices by changing device parameters to optimize performance before fabrication. In this dissertation, we develop two commonly used numerical techniques, namely the transfer matrix method and rigorous coupled wave analysis, in C++ and MATLAB, and use two additional software packages, one open-source and another commercial, to model one-dimensional photonic crystals. Different variants of one-dimensional multilayered structures, such as perfectly periodic dielectric multilayers, quasicrystals, and aperiodic multilayers, are modeled, along with one-dimensional photonic crystals with gratings on the top layer. Applications of Bloch surface waves, along with new and novel aperiodic dielectric multilayer structures that support Bloch surface waves, are explored in this dissertation. We demonstrate a slow-light configuration that makes use of Bloch Surface Waves as an intermediate excitation in a double-prism tunneling configuration. This method is simple compared to the more usual techniques for slowing light using the phenomenon of electromagnetically induced transparency in atomic gases or doped ionic crystals operated at temperatures below 4 K. Using a semi
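    A hedged, minimal transfer-matrix sketch for a lossless 1D multilayer at normal incidence, in the spirit of the methods named above (it is not the dissertation's code); the layer indices and thicknesses in the example are illustrative.

      import numpy as np

      def reflectance(n_layers, d_layers, n_in, n_sub, wavelength):
          """Stack reflectance from Macleod-style characteristic matrices (normal incidence)."""
          M = np.eye(2, dtype=complex)
          for n, d in zip(n_layers, d_layers):
              delta = 2.0 * np.pi * n * d / wavelength
              M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                                [1j * n * np.sin(delta), np.cos(delta)]])
          B, C = M @ np.array([1.0, n_sub])
          r = (n_in * B - C) / (n_in * B + C)
          return abs(r) ** 2

      # Example: a quarter-wave (HL)^5 Bragg mirror on glass at 600 nm (illustrative indices).
      nH, nL, lam = 2.3, 1.45, 600e-9
      stack_n = [nH, nL] * 5
      stack_d = [lam / (4 * nH), lam / (4 * nL)] * 5
      print(round(reflectance(stack_n, stack_d, 1.0, 1.52, lam), 3))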

  6. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  7. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    Full Text Available A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO problems under uncertainty. Two typical methodologies have been presented and implemented, including the performance measure approach (PMA and the sequential optimization and reliability assessment (SORA. To enhance the computational efficiency of reliability analysis, stochastic response surface method (SRSM is applied to approximate the true limit state function with respect to the normalized random variables, combined with the reasonable design of experiments generated by sparse grid design, which was proven to be an effective and special discretization technique. The uncertainties such as material property and external loads are considered on three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte-Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that application of SRSM with SGD can produce an efficient reliability analysis in RBTO which enables a more reliable design than that obtained by DTO. It is also found that, under identical accuracy, SORA is superior to PMA in view of computational efficiency.
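    A minimal sketch of the Monte-Carlo verification step mentioned above: estimate the failure probability P_f = P[g(X) < 0] by direct sampling. The limit-state function and distributions below are illustrative, not those of the paper's examples.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 200_000
      load     = rng.normal(10.0, 2.0, n)      # hypothetical random load
      capacity = rng.normal(18.0, 3.0, n)      # hypothetical random resistance
      g = capacity - load                      # limit-state function: failure when g < 0
      p_f = np.mean(g < 0.0)
      std_err = np.sqrt(p_f * (1.0 - p_f) / n)
      print(f"P_f ~ {p_f:.4f} +/- {std_err:.4f}")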

  8. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of the three levels of correlation and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation measures (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
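    A hedged sketch of the three correlation levels discussed above, computed between one computationally-light metric (degree centrality) and one computationally-heavy metric (betweenness centrality) on a random test graph standing in for the 50 real-world networks.

      import networkx as nx
      from scipy.stats import kendalltau, spearmanr, pearsonr

      G = nx.erdos_renyi_graph(200, 0.05, seed=1)
      nodes = list(G.nodes())
      deg = nx.degree_centrality(G)            # computationally light
      btw = nx.betweenness_centrality(G)       # computationally heavy
      cheap = [deg[v] for v in nodes]
      heavy = [btw[v] for v in nodes]

      print("Kendall  (pair-wise concordance):", round(kendalltau(cheap, heavy)[0], 3))
      print("Spearman (network-wide ranking) :", round(spearmanr(cheap, heavy)[0], 3))
      print("Pearson  (linear regression)    :", round(pearsonr(cheap, heavy)[0], 3))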

  9. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games in addition to traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performance on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  10. Evaluation of interatomic potentials for rainbow scattering under axial channeling at KCl(0 0 1) surface by three-dimensional computer simulations based on binary collision approximation

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Wataru, E-mail: take@sp.ous.ac.jp

    2017-05-01

    The rainbow angles corresponding to prominent peaks in the angular distributions of scattered projectiles with small angle, attributed to rainbow scattering (RS), under axial surface channeling conditions are strongly influenced by the interatomic potentials between projectiles and target atoms. The dependence of rainbow angles on normal energy of projectile energy to the target surface, being experimentally obtained by Specht et al. for RS of He, N, Ne and Ar atoms under 〈1 0 0〉 and 〈1 1 0〉 axial channeling conditions at a KCl(0 0 1) surface with projectile energies of 1–60 keV, was evaluated by the three-dimensional computer simulations using the ACOCT code based on the binary collision approximation with interatomic pair potentials. Good agreement between the ACOCT results using the ZBL pair potential and the individual pair potentials calculated from Hartree-Fock (HF) wave functions and the experimental ones was found for RS of He, N and Ne atoms from the atomic rows along 〈1 0 0〉 direction. For 〈1 1 0〉 direction, the ACOCT results employing the Moliere pair potential with adjustable screening length of O’Connor-Biersack (OB) formula, the ZBL pair potential and the individual HF pair potentials except for Ar → KCl using the OB pair potential are nearly in agreement with the experimental ones.

  11. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  12. Pulmonary lobe segmentation based on ridge surface sampling and shape model fitting

    Energy Technology Data Exchange (ETDEWEB)

    Ross, James C., E-mail: jross@bwh.harvard.edu [Channing Laboratory, Brigham and Women's Hospital, Boston, Massachusetts 02215 (United States); Surgical Planning Lab, Brigham and Women's Hospital, Boston, Massachusetts 02215 (United States); Laboratory of Mathematics in Imaging, Brigham and Women's Hospital, Boston, Massachusetts 02126 (United States); Kindlmann, Gordon L. [Computer Science Department and Computation Institute, University of Chicago, Chicago, Illinois 60637 (United States); Okajima, Yuka; Hatabu, Hiroto [Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts 02215 (United States); Díaz, Alejandro A. [Pulmonary and Critical Care Division, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts 02215 and Department of Pulmonary Diseases, Pontificia Universidad Católica de Chile, Santiago (Chile); Silverman, Edwin K. [Channing Laboratory, Brigham and Women's Hospital, Boston, Massachusetts 02215 and Pulmonary and Critical Care Division, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts 02215 (United States); Washko, George R. [Pulmonary and Critical Care Division, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts 02215 (United States); Dy, Jennifer [ECE Department, Northeastern University, Boston, Massachusetts 02115 (United States); Estépar, Raúl San José [Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts 02215 (United States); Surgical Planning Lab, Brigham and Women's Hospital, Boston, Massachusetts 02215 (United States); Laboratory of Mathematics in Imaging, Brigham and Women's Hospital, Boston, Massachusetts 02126 (United States)

    2013-12-15

    Purpose: Performing lobe-based quantitative analysis of the lung in computed tomography (CT) scans can assist in efforts to better characterize complex diseases such as chronic obstructive pulmonary disease (COPD). While airways and vessels can help to indicate the location of lobe boundaries, segmentations of these structures are not always available, so methods to define the lobes in the absence of these structures are desirable. Methods: The authors present a fully automatic lung lobe segmentation algorithm that is effective in volumetric inspiratory and expiratory computed tomography (CT) datasets. The authors rely on ridge surface image features indicating fissure locations and a novel approach to modeling shape variation in the surfaces defining the lobe boundaries. The authors employ a particle system that efficiently samples ridge surfaces in the image domain and provides a set of candidate fissure locations based on the Hessian matrix. Following this, lobe boundary shape models generated from principal component analysis (PCA) are fit to the particles data to discriminate between fissure and nonfissure candidates. The resulting set of particle points are used to fit thin plate spline (TPS) interpolating surfaces to form the final boundaries between the lung lobes. Results: The authors tested algorithm performance on 50 inspiratory and 50 expiratory CT scans taken from the COPDGene study. Results indicate that the authors' algorithm performs comparably to pulmonologist-generated lung lobe segmentations and can produce good results in cases with accessory fissures, incomplete fissures, advanced emphysema, and low dose acquisition protocols. Dice scores indicate that only 29 out of 500 (5.85%) lobes showed Dice scores lower than 0.9. Two different approaches for evaluating lobe boundary surface discrepancies were applied and indicate that algorithm boundary identification is most accurate in the vicinity of fissures detectable on CT. Conclusions: The

  13. Pulmonary lobe segmentation based on ridge surface sampling and shape model fitting

    International Nuclear Information System (INIS)

    Ross, James C.; Kindlmann, Gordon L.; Okajima, Yuka; Hatabu, Hiroto; Díaz, Alejandro A.; Silverman, Edwin K.; Washko, George R.; Dy, Jennifer; Estépar, Raúl San José

    2013-01-01

    Purpose: Performing lobe-based quantitative analysis of the lung in computed tomography (CT) scans can assist in efforts to better characterize complex diseases such as chronic obstructive pulmonary disease (COPD). While airways and vessels can help to indicate the location of lobe boundaries, segmentations of these structures are not always available, so methods to define the lobes in the absence of these structures are desirable. Methods: The authors present a fully automatic lung lobe segmentation algorithm that is effective in volumetric inspiratory and expiratory computed tomography (CT) datasets. The authors rely on ridge surface image features indicating fissure locations and a novel approach to modeling shape variation in the surfaces defining the lobe boundaries. The authors employ a particle system that efficiently samples ridge surfaces in the image domain and provides a set of candidate fissure locations based on the Hessian matrix. Following this, lobe boundary shape models generated from principal component analysis (PCA) are fit to the particles data to discriminate between fissure and nonfissure candidates. The resulting set of particle points are used to fit thin plate spline (TPS) interpolating surfaces to form the final boundaries between the lung lobes. Results: The authors tested algorithm performance on 50 inspiratory and 50 expiratory CT scans taken from the COPDGene study. Results indicate that the authors' algorithm performs comparably to pulmonologist-generated lung lobe segmentations and can produce good results in cases with accessory fissures, incomplete fissures, advanced emphysema, and low dose acquisition protocols. Dice scores indicate that only 29 out of 500 (5.85%) lobes showed Dice scores lower than 0.9. Two different approaches for evaluating lobe boundary surface discrepancies were applied and indicate that algorithm boundary identification is most accurate in the vicinity of fissures detectable on CT. Conclusions: The proposed
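    A hedged sketch in the spirit of the Hessian-based ridge-surface sampling described above (not the authors' particle system): smooth the volume, form the voxel-wise Hessian, and keep plate-like points where one eigenvalue is strongly negative and the next is comparatively small. The sigma and thresholds are illustrative.

      import numpy as np
      from scipy import ndimage

      def fissure_candidates(volume, sigma=1.0, strength=50.0, planarity=0.5):
          # Second derivatives of the Gaussian-smoothed volume (all six unique Hessian terms).
          d = {}
          for i in range(3):
              for j in range(i, 3):
                  order = [0, 0, 0]
                  order[i] += 1
                  order[j] += 1
                  d[(i, j)] = ndimage.gaussian_filter(volume.astype(float), sigma, order=order)
          H = np.stack([np.stack([d[(min(i, j), max(i, j))] for j in range(3)], axis=-1)
                        for i in range(3)], axis=-2)
          w = np.linalg.eigvalsh(H)                    # eigenvalues in ascending order
          l1, l2 = w[..., 0], w[..., 1]                # most negative, then middle
          # Plate-like (bright sheet): one strongly negative eigenvalue, the next much smaller.
          return (l1 < -strength) & (np.abs(l2) < planarity * np.abs(l1))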

  14. Computed potential energy surfaces for chemical reactions

    Science.gov (United States)

    Walch, Stephen P.

    1988-01-01

    The minimum energy path for the addition of a hydrogen atom to N2 is characterized in CASSCF/CCI calculations using the (4s3p2d1f/3s2p1d) basis set, with additional single point calculations at the stationary points of the potential energy surface using the (5s4p3d2f/4s3p2d) basis set. These calculations represent the most extensive set of ab initio calculations completed to date, yielding a zero-point corrected barrier for HN2 dissociation of approximately 8.5 kcal/mol. The lifetime of the HN2 species is estimated from the calculated geometries and energetics using both conventional Transition State Theory and a method which utilizes an Eckart barrier to compute one-dimensional quantum mechanical tunneling effects. It is concluded that the lifetime of the HN2 species is very short, greatly limiting its role in both termolecular recombination reactions and combustion processes.
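    A hedged sketch of the conventional transition-state-theory estimate referred to above, without the Eckart tunneling correction; the 8.5 kcal/mol barrier is taken from the abstract and the temperature is an assumed example value.

      import math

      KB = 1.380649e-23           # Boltzmann constant, J/K
      H_PLANCK = 6.62607015e-34   # Planck constant, J*s
      R_KCAL = 1.987204e-3        # gas constant, kcal/(mol*K)

      def tst_rate(barrier_kcal, temp_k):
          """k(T) = (kB*T/h) * exp(-Ea / (R*T)), the simplest TST estimate."""
          return (KB * temp_k / H_PLANCK) * math.exp(-barrier_kcal / (R_KCAL * temp_k))

      k = tst_rate(8.5, 300.0)
      print(f"k ~ {k:.2e} s^-1, lifetime ~ {1.0 / k:.2e} s")   # a very short-lived species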

  15. Women and Computer Based Technologies: A Feminist Perspective.

    Science.gov (United States)

    Morritt, Hope

    The use of computer-based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer-based technologies were categorized using three…

  16. Acid base properties of a goethite surface model: A theoretical view

    Science.gov (United States)

    Aquino, Adelia J. A.; Tunega, Daniel; Haberhauer, Georg; Gerzabek, Martin H.; Lischka, Hans

    2008-08-01

    Density functional theory is used to compute the effect of protonation, deprotonation, and dehydroxylation of different reactive sites of a goethite surface modeled as a cluster containing six iron atoms constructed from a slab model of the (1 1 0) goethite surface. Solvent effects were treated at two different levels: (i) by inclusion of up to six water molecules explicitly into the quantum chemical calculation and (ii) by using additionally a continuum solvation model for the long-range interactions. Systematic studies were made in order to test the limit of the fully hydrated cluster surfaces by a monomolecular water layer. The main finding is that of the three different types of surface hydroxyl groups (hydroxo, μ-hydroxo, and μ3-hydroxo), the hydroxo group is most active for protonation whereas μ- and μ3-hydroxo sites undergo deprotonation more easily. Proton affinity constants (pKa values) were computed from appropriate protonation/deprotonation reactions for all sites investigated and compared to results obtained from the multisite complexation model (MUSIC). The approach used was validated for the consecutive deprotonation reactions of the [Fe(H2O)6]3+ complex in solution, and good agreement between calculated and experimental pKa values was found. The computed pKa values for all sites of the modeled goethite surface were used in the prediction of the pristine point of zero charge, pHPPZC. The obtained value of 9.1 fits well with published experimental values of 7.0-9.5.
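    A small sketch of the thermodynamic relation behind such computed pKa values, assuming an aqueous deprotonation free energy is already available: pKa = dG / (ln(10)·R·T). The 12.3 kcal/mol value below is illustrative only, not a result from the paper.

      import math

      R_KCAL = 1.987204e-3    # gas constant, kcal/(mol*K)

      def pka_from_dg(dg_kcal_per_mol, temp_k=298.15):
          """pKa for a deprotonation reaction with aqueous free-energy change dG (kcal/mol)."""
          return dg_kcal_per_mol / (math.log(10.0) * R_KCAL * temp_k)

      print(round(pka_from_dg(12.3), 2))   # ~9.0, i.e. a weakly acidic surface hydroxyl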

  17. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  18. Surface electrostatics: theory and computations

    KAUST Repository

    Chatzigeorgiou, G.

    2014-02-05

    The objective of this work is to study the electrostatic response of materials accounting for boundary surfaces with their own (electrostatic) constitutive behaviour. The electric response of materials with (electrostatic) energetic boundary surfaces (surfaces that possess material properties and constitutive structures different from those of the bulk) is formulated in a consistent manner using a variational framework. The forces and moments that appear due to bulk and surface electric fields are also expressed in a consistent manner. The theory is accompanied by numerical examples on porous materials using the finite-element method, where the influence of the surface electric permittivity on the electric displacement, the polarization stress and the Maxwell stress is examined.

  19. Experimental/Computational Approach to Accommodation Coefficients and its Application to Noble Gases on Aluminum Surface (Preprint)

    Science.gov (United States)

    2009-02-03

    Experimental/computational approach to accommodation coefficients and its application to noble gases on aluminum surface. Nathaniel Selden, University of Southern California, Los Angeles. [Figure caption fragment: FIG. 5 — Experimental and computed radiometric force for argon (left) and xenon.]

  20. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  1. PL-PatchSurfer: A Novel Molecular Local Surface-Based Method for Exploring Protein-Ligand Interactions

    Directory of Open Access Journals (Sweden)

    Bingjie Hu

    2014-08-01

    Full Text Available Structure-based computational methods have been widely used in exploring protein-ligand interactions, including predicting the binding ligands of a given protein based on their structural complementarity. Compared to other protein and ligand representations, the advantages of a surface representation include reduced sensitivity to subtle changes in the pocket and ligand conformation and fast search speed. Here we developed a novel method named PL-PatchSurfer (Protein-Ligand PatchSurfer). PL-PatchSurfer represents the protein binding pocket and the ligand molecular surface as a combination of segmented surface patches. Each patch is characterized by its geometrical shape and the electrostatic potential, which are represented using the 3D Zernike descriptor (3DZD). We first tested PL-PatchSurfer on binding ligand prediction and found it outperformed the pocket-similarity based ligand prediction program. We then optimized the search algorithm of PL-PatchSurfer using the PDBbind dataset. Finally, we explored the utility of applying PL-PatchSurfer to a larger and more diverse dataset and showed that PL-PatchSurfer was able to provide a high early enrichment for most of the targets. To the best of our knowledge, PL-PatchSurfer is the first surface patch-based method that treats ligand complementarity at protein binding sites. We believe that using a surface patch approach to better understand protein-ligand interactions has the potential to significantly enhance the design of new ligands for a wide array of drug-targets.

  2. PL-PatchSurfer: a novel molecular local surface-based method for exploring protein-ligand interactions.

    Science.gov (United States)

    Hu, Bingjie; Zhu, Xiaolei; Monroe, Lyman; Bures, Mark G; Kihara, Daisuke

    2014-08-27

    Structure-based computational methods have been widely used in exploring protein-ligand interactions, including predicting the binding ligands of a given protein based on their structural complementarity. Compared to other protein and ligand representations, the advantages of a surface representation include reduced sensitivity to subtle changes in the pocket and ligand conformation and fast search speed. Here we developed a novel method named PL-PatchSurfer (Protein-Ligand PatchSurfer). PL-PatchSurfer represents the protein binding pocket and the ligand molecular surface as a combination of segmented surface patches. Each patch is characterized by its geometrical shape and the electrostatic potential, which are represented using the 3D Zernike descriptor (3DZD). We first tested PL-PatchSurfer on binding ligand prediction and found it outperformed the pocket-similarity based ligand prediction program. We then optimized the search algorithm of PL-PatchSurfer using the PDBbind dataset. Finally, we explored the utility of applying PL-PatchSurfer to a larger and more diverse dataset and showed that PL-PatchSurfer was able to provide a high early enrichment for most of the targets. To the best of our knowledge, PL-PatchSurfer is the first surface patch-based method that treats ligand complementarity at protein binding sites. We believe that using a surface patch approach to better understand protein-ligand interactions has the potential to significantly enhance the design of new ligands for a wide array of drug-targets.
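    A hedged sketch of the patch-comparison idea: each surface patch is reduced to rotation-invariant descriptor vectors (shape plus electrostatics) and complementarity is scored by a weighted descriptor distance. The vector length, weights, and random values are illustrative, not PL-PatchSurfer's actual scoring function.

      import numpy as np

      def patch_distance(shape_a, elec_a, shape_b, elec_b, w_shape=1.0, w_elec=0.5):
          d_shape = np.linalg.norm(np.asarray(shape_a) - np.asarray(shape_b))
          d_elec = np.linalg.norm(np.asarray(elec_a) - np.asarray(elec_b))
          return w_shape * d_shape + w_elec * d_elec

      rng = np.random.default_rng(7)
      pocket_patch = (rng.normal(size=121), rng.normal(size=121))   # hypothetical descriptor length
      ligand_patch = (rng.normal(size=121), rng.normal(size=121))
      print(round(patch_distance(*pocket_patch, *ligand_patch), 3))  # smaller = better match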

  3. Optical switches based on surface plasmons

    International Nuclear Information System (INIS)

    Chen Cong; Wang Pei; Yuan Guanghui; Wang Xiaolei; Min Changjun; Deng Yan; Lu Yonghua; Ming Hai

    2008-01-01

    Great attention is being paid to surface plasmons (SPs) because of their potential applications in sensors, data storage and bio-photonics. Recently, more and more optical switches based on surface plasmon effects have been demonstrated either by simulation or experimentally. This article describes the principles, advantages and disadvantages of various types of optical switches based on SPs, in particular the all-optical switches. (authors)

  4. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBP), soft controls (SCs), and integrated information systems are being adopted in the main control rooms (MCR) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified several important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially the CBP and SCs. In the case of the computer-based procedure, rather than the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  5. Summary of theoretical and experimental investigation of grating type, silicon photovoltaic cells. [using p-n junctions on light receiving surface of base crystal

    Science.gov (United States)

    Chen, L. Y.; Loferski, J. J.

    1975-01-01

    Theoretical and experimental aspects are summarized for single-crystal silicon photovoltaic devices made by forming a grating pattern of p/n junctions on the light-receiving surface of the base crystal. Based on the general semiconductor equations, a mathematical description is presented for the photovoltaic properties of such grating-like structures in two-dimensional form. The resulting second-order elliptic equation is solved by computer modeling to give solutions for various reasonable initial values of bulk resistivity, excess carrier concentration, and surface recombination velocity. The validity of the computer model is established by comparison with p/n devices produced by alloying an aluminum grating pattern into the surface of n-type silicon wafers. Current-voltage characteristics and spectral response curves are presented for cells of this type constructed on wafers of different resistivities and orientations.

  6. Computational study on the interactions and orientation of monoclonal human immunoglobulin G on a polystyrene surface

    Directory of Open Access Journals (Sweden)

    Javkhlantugs N

    2013-07-01

    Full Text Available Namsrai Javkhlantugs,1,2 Hexig Bayar,3 Chimed Ganzorig,1 Kazuyoshi Ueda2 1Center for Nanoscience and Nanotechnology and Department of Chemical Technology, School of Chemistry and Chemical Engineering, National University of Mongolia, Ulaanbaatar, Mongolia; 2Department of Advanced Materials Chemistry, Graduate School of Engineering, Yokohama National University, Yokohama, Japan; 3The Key Laboratory of Mammalian Reproductive Biology and Biotechnology of the Ministry of Education, Inner Mongolia University, Hohhot, Inner Mongolia Autonomous Region, People's Republic of China Abstract: Having a theoretical understanding of the orientation of immunoglobulin on an immobilized solid surface is important in biomedical pathogen-detecting systems and cellular analysis. Although the stable adsorption of immunoglobulin on a polystyrene (PS) surface has been applied in many kinds of immunoassays, there are many uncertainties in antibody-based clinical and biological experimental methods. To understand the binding mechanism and physicochemical interactions between immunoglobulin and the PS surface at the atomic level, we investigated the binding behavior and interactions of monoclonal immunoglobulin G (IgG) on the PS surface using computational methods. In our docking simulations with different translational and rotational orientations of IgG on the PS surface, three typical orientation patterns of the immunoglobulin G on the PS surface were found. We precisely analyzed these orientation patterns and clarified how the immunoglobulin G interacts with the PS surface at the atomic scale at the beginning of the adsorption process. The major driving forces for the adsorption of IgG onto the PS surface come from serine (Ser), aspartic acid (Asp), and glutamic acid (Glu) residues. Keywords: bionano interface, immunoassay, polystyrene, IgG, physical adsorption, simulation

  7. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
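
    As a minimal illustration of the sampling-based techniques such a text covers (a generic sketch, not an excerpt from the book), the following estimates a one-dimensional integral by plain Monte Carlo and reports the usual 1/sqrt(N) standard error.

```python
import numpy as np

# Estimate I = integral of exp(-x^2) over [0, 1] by plain Monte Carlo sampling.
rng = np.random.default_rng(42)
n = 100_000
x = rng.uniform(0.0, 1.0, size=n)
samples = np.exp(-x**2)

estimate = samples.mean()
std_error = samples.std(ddof=1) / np.sqrt(n)   # Monte Carlo error shrinks as 1/sqrt(n)
print(f"I ~ {estimate:.5f} +/- {std_error:.5f}")  # true value is about 0.74682
```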

  8. CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling

    Science.gov (United States)

    Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.

    2012-12-01

    The Community Surface Dynamic Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies) and a CSDMS Industrial Consortium (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, access to high performance computing clusters in support of developing and running models, and a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data is being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework. Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and

  9. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54. Fulltext. Click here to view fulltext PDF. Permanent link:

  10. Computer-Aided Panoramic Images Enriched by Shadow Construction on a Prism and Pyramid Polyhedral Surface

    Directory of Open Access Journals (Sweden)

    Jolanta Dzwierzynska

    2017-10-01

    Full Text Available The aim of this study is to develop an efficient and practical method of a direct mapping of a panoramic projection on an unfolded prism and pyramid polyhedral projection surface with the aid of a computer. Due to the fact that straight lines very often appear in any architectural form we formulate algorithms which utilize data about lines and draw panoramas as plots of functions in Mathcad software. The ability to draw panoramic images of lines enables drawing a wireframe image of an architectural object. The application of the multicenter projection, as well as the idea of shadow construction in the panoramic representation, aims at achieving a panoramic image close to human perception. The algorithms are universal as the application of changeable base elements of panoramic projection—horizon height, station point location, number of polyhedral walls—enables drawing panoramic images from various viewing positions. However, for more efficient and easier drawing, the algorithms should be implemented in some graphical package. The representation presented in the paper and the method of its direct mapping on a flat unfolded projection surface can find application in the presentation of architectural spaces in advertising and art when drawings are displayed on polyhedral surfaces and can be observed from multiple viewing positions.

  11. Segmentation of 3D ultrasound computer tomography reflection images using edge detection and surface fitting

    Science.gov (United States)

    Hopp, T.; Zapf, M.; Ruiter, N. V.

    2014-03-01

    An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for the use in further image processing, is to segment the breast from the background. In this work we present a (semi-) automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be significantly reduced by a factor of four compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation with an average of 11% of differing voxels and an average surface deviation of 2mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing a fully-automated usage of our segmentation approach.

  12. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with computer request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  13. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the author's knowledge, it is the first attempt to incorporate computer vision techniques into room interior designing. The computer vision based interior designing is achieved in two steps: object identification and color assignment. The image segmentation approach is used for the identification of the objects in the room and different color schemes are used for color assignment to these objects. The proposed approach is applied to simple as well as complex images from online sources. The proposed approach not only accelerated the process of interior designing but also made it very efficient by giving multiple alternatives.

  14. Calculation of acoustic field based on laser-measured vibration velocities on ultrasonic transducer surface

    Science.gov (United States)

    Hu, Liang; Zhao, Nannan; Gao, Zhijian; Mao, Kai; Chen, Wenyu; Fu, Xin

    2018-05-01

    Determination of the distribution of a generated acoustic field is valuable for studying ultrasonic transducers, including providing guidance for transducer design and the basis for analyzing their performance, etc. A method for calculating the acoustic field based on laser-measured vibration velocities on the ultrasonic transducer surface is proposed in this paper. Without knowing the inner structure of the transducer, the acoustic field outside it can be calculated by solving the governing partial differential equation (PDE) of the field based on the specified boundary conditions (BCs). In our study, the BC on the transducer surface, i.e. the distribution of the vibration velocity on the surface, is accurately determined by laser scanning measurement of discrete points followed by a data fitting computation. In addition, to ensure the calculation accuracy for the whole field even in an inhomogeneous medium, a finite element method is used to solve the governing PDE based on the mixed BCs, including the discretely measured velocity data and other specified BCs. The method is first validated on numerical piezoelectric transducer models. The acoustic pressure distributions generated by a transducer operating in a homogeneous and an inhomogeneous medium, respectively, are both calculated by the proposed method and compared with the results from other existing methods. Then, the method is further experimentally validated with two actual ultrasonic transducers used for flow measurement in our lab. The amplitude change of the output voltage signal from the receiver transducer due to changing the relative position of the two transducers is calculated by the proposed method and compared with the experimental data. This method can also provide the basis for complex multi-physical coupling computations where the effect of the acoustic field should be taken into account.

  15. Kinetic computer modeling of microwave surface-wave plasma production

    International Nuclear Information System (INIS)

    Ganachev, Ivan P.

    2004-01-01

    Kinetic computer plasma modeling occupies an intermediate position between the time consuming rigorous particle dynamic simulation and the fast but rather rough cold- or warm-plasma fluid models. The present paper reviews the kinetic modeling of microwave surface-wave discharges with accent on recent kinetic self-consistent models, where the external input parameters are reduced to the necessary minimum (frequency and intensity of the applied microwave field and pressure and geometry of the discharge vessel). The presentation is limited to low pressures, so that Boltzmann equation is solved in non-local approximation and collisional electron heating is neglected. The numerical results reproduce correctly the bi-Maxwellian electron energy distribution functions observed experimentally. (author)

  16. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

    Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adoptin...

  17. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  18. Computer Series, 115.

    Science.gov (United States)

    Birk, James P., Ed.

    1990-01-01

    Reviewed are six computer programs which may be useful in teaching college level chemistry. Topics include dynamic data storage in FORTRAN, "KC?DISCOVERER," pH of acids and bases, calculating percent boundary surfaces for orbitals, and laboratory interfacing with PT Nomograph for the Macintosh. (CW)

  19. Surface processing with ionized cluster beams: computer simulation

    International Nuclear Information System (INIS)

    Insepov, Z.; Yamada, I.

    1999-01-01

    Molecular Dynamics (MD) and Monte Carlo (MC) models of energetic gas cluster irradiation of a solid surface have been developed to investigate the phenomena of crater formation, sputtering, surface treatment, and the evaluation of material hardness by irradiation with cluster ions. Theoretical estimations of the crater dimensions formed by Ar gas cluster ion irradiation of different substrates, based on hydrodynamics and MD simulation, are presented. The atomic scale shock waves arising from cluster impact were obtained by calculating the pressure, temperature and mass-velocity of the target atoms. The crater depth shows a characteristic 1/3-power dependence on the cluster energy and on the cold material Brinell hardness number (BHN). A new 'true material hardness' scale, which can be very useful, for example, for thin film coatings deposited on a soft substrate, is defined. This finding could be used as a new technique for measuring material hardness. Evolution of surface morphology under cluster ion irradiation was described by the surface relaxation equation, which contains a term for crater formation at cluster impact. The formation of ripples on a surface irradiated with oblique cluster ion beams was predicted. MD and MC models of Decaborane ion (B10H14) implantation into Si and the subsequent rapid thermal annealing (RTA) have been developed

  20. Nonlocal continuum-based modeling of breathing mode of nanowires including surface stress and surface inertia effects

    Science.gov (United States)

    Ghavanloo, Esmaeal; Fazelzadeh, S. Ahmad; Rafii-Tabar, Hashem

    2014-05-01

    Nonlocal and surface effects significantly influence the mechanical response of nanomaterials and nanostructures. In this work, the breathing mode of a circular nanowire is studied on the basis of the nonlocal continuum model. Both the surface elastic properties and surface inertia effect are included. Nanowires can be modeled as long cylindrical solid objects. The classical model is reformulated using the nonlocal differential constitutive relations of Eringen and Gurtin-Murdoch surface continuum elasticity formalism. A new frequency equation for the breathing mode of nanowires, including small scale effect, surface stress and surface inertia is presented by employing the Bessel functions. Numerical results are computed, and are compared to confirm the validity and accuracy of the proposed method. Furthermore, the model is used to elucidate the effect of nonlocal parameter, the surface stress, the surface inertia and the nanowire orientation on the breathing mode of several types of nanowires with size ranging from 0.5 to 4 nm. Our results reveal that the combined surface and small scale effects are significant for nanowires with diameter smaller than 4 nm.

  1. Nonlocal continuum-based modeling of breathing mode of nanowires including surface stress and surface inertia effects

    International Nuclear Information System (INIS)

    Ghavanloo, Esmaeal; Fazelzadeh, S. Ahmad; Rafii-Tabar, Hashem

    2014-01-01

    Nonlocal and surface effects significantly influence the mechanical response of nanomaterials and nanostructures. In this work, the breathing mode of a circular nanowire is studied on the basis of the nonlocal continuum model. Both the surface elastic properties and surface inertia effect are included. Nanowires can be modeled as long cylindrical solid objects. The classical model is reformulated using the nonlocal differential constitutive relations of Eringen and Gurtin–Murdoch surface continuum elasticity formalism. A new frequency equation for the breathing mode of nanowires, including small scale effect, surface stress and surface inertia is presented by employing the Bessel functions. Numerical results are computed, and are compared to confirm the validity and accuracy of the proposed method. Furthermore, the model is used to elucidate the effect of nonlocal parameter, the surface stress, the surface inertia and the nanowire orientation on the breathing mode of several types of nanowires with size ranging from 0.5 to 4 nm. Our results reveal that the combined surface and small scale effects are significant for nanowires with diameter smaller than 4 nm.

  2. Nonlocal continuum-based modeling of breathing mode of nanowires including surface stress and surface inertia effects

    Energy Technology Data Exchange (ETDEWEB)

    Ghavanloo, Esmaeal, E-mail: ghavanloo@shirazu.ac.ir [School of Mechanical Engineering, Shiraz University, Shiraz 71963-16548 (Iran, Islamic Republic of); Fazelzadeh, S. Ahmad [School of Mechanical Engineering, Shiraz University, Shiraz 71963-16548 (Iran, Islamic Republic of); Rafii-Tabar, Hashem [Department of Medical Physics and Biomedical Engineering, Research Center for Medical Nanotechnology and Tissue Engineering, Shahid Beheshti University of Medical Sciences, Evin, Tehran (Iran, Islamic Republic of); Computational Physical Sciences Research Laboratory, School of Nano-Science, Institute for Research in Fundamental Sciences (IPM), Tehran (Iran, Islamic Republic of)

    2014-05-01

    Nonlocal and surface effects significantly influence the mechanical response of nanomaterials and nanostructures. In this work, the breathing mode of a circular nanowire is studied on the basis of the nonlocal continuum model. Both the surface elastic properties and surface inertia effect are included. Nanowires can be modeled as long cylindrical solid objects. The classical model is reformulated using the nonlocal differential constitutive relations of Eringen and Gurtin–Murdoch surface continuum elasticity formalism. A new frequency equation for the breathing mode of nanowires, including small scale effect, surface stress and surface inertia is presented by employing the Bessel functions. Numerical results are computed, and are compared to confirm the validity and accuracy of the proposed method. Furthermore, the model is used to elucidate the effect of nonlocal parameter, the surface stress, the surface inertia and the nanowire orientation on the breathing mode of several types of nanowires with size ranging from 0.5 to 4 nm. Our results reveal that the combined surface and small scale effects are significant for nanowires with diameter smaller than 4 nm.

  3. Surface-based prostate registration with biomechanical regularization

    Science.gov (United States)

    van de Ven, Wendy J. M.; Hu, Yipeng; Barentsz, Jelle O.; Karssemeijer, Nico; Barratt, Dean; Huisman, Henkjan J.

    2013-03-01

    Adding MR-derived information to standard transrectal ultrasound (TRUS) images for guiding prostate biopsy is of substantial clinical interest. A tumor visible on MR images can be projected on ultrasound by using MRUS registration. A common approach is to use surface-based registration. We hypothesize that biomechanical modeling will better control deformation inside the prostate than a regular surface-based registration method. We developed a novel method by extending a surface-based registration with finite element (FE) simulation to better predict internal deformation of the prostate. For each of six patients, a tetrahedral mesh was constructed from the manual prostate segmentation. Next, the internal prostate deformation was simulated using the derived radial surface displacement as boundary condition. The deformation field within the gland was calculated using the predicted FE node displacements and thin-plate spline interpolation. We tested our method on MR guided MR biopsy imaging data, as landmarks can easily be identified on MR images. For evaluation of the registration accuracy we used 45 anatomical landmarks located in all regions of the prostate. Our results show that the median target registration error of a surface-based registration with biomechanical regularization is 1.88 mm, which is significantly different from 2.61 mm without biomechanical regularization. We can conclude that biomechanical FE modeling has the potential to improve the accuracy of multimodal prostate registration when comparing it to regular surface-based registration.

  4. A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution.

    Science.gov (United States)

    Hayashi, Hideaki; Furui, Akira; Kurita, Yuichi; Tsuji, Toshio

    2017-11-01

    Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
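
    The generative idea described above can be sketched as follows (illustrative parameter values, not the authors' estimation code): the signal is zero-mean Gaussian noise whose variance is itself drawn from an inverse gamma distribution, sampled here by inverting a gamma variate.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_emg(n_samples, alpha=3.0, beta=2.0):
    """Generate an EMG-like signal: zero-mean Gaussian noise whose variance
    follows an inverse gamma distribution (if G ~ Gamma(alpha, scale=1/beta),
    then 1/G ~ InvGamma(alpha, scale=beta))."""
    variance = 1.0 / rng.gamma(shape=alpha, scale=1.0 / beta)
    return rng.normal(loc=0.0, scale=np.sqrt(variance), size=n_samples), variance

signal, true_var = simulate_emg(5000)
print(f"drawn variance: {true_var:.3f}, empirical variance: {signal.var():.3f}")
```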

  5. 26 CFR 1.809-10 - Computation of equity base.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Computation of equity base. 1.809-10 Section 1... (CONTINUED) INCOME TAXES Gain and Loss from Operations § 1.809-10 Computation of equity base. (a) In general. For purposes of section 809, the equity base of a life insurance company includes the amount of any...

  6. [Computer aided design for fixed partial denture framework based on reverse engineering technology].

    Science.gov (United States)

    Sun, Yu-chun; Lü, Pei-jun; Wang, Yong

    2006-03-01

    To explore a computer aided design (CAD) route for the framework of domestic fixed partial denture (FPD) and confirm the suitable method of 3-D CAD. The working area of a dentition model was scanned with a 3-D mechanical scanner. Using the reverse engineering (RE) software, margin and border curves were extracted and several reference curves were created to ensure the dimension and location of pontic framework that was taken from the standard database. The shoulder parts of the retainers were created after axial surfaces constructed. The connecting areas, axial line and curving surface of the framework connector were finally created. The framework of a three-unit FPD was designed with RE technology, which showed smooth surfaces and continuous contours. The design route is practical. The result of this study is significant in theory and practice, which will provide a reference for establishing the computer aided design/computer aided manufacture (CAD/CAM) system of domestic FPD.

  7. Enhancing Lecture Presentations in Introductory Biology with Computer-Based Multimedia.

    Science.gov (United States)

    Fifield, Steve; Peifer, Rick

    1994-01-01

    Uses illustrations and text to discuss convenient ways to organize and present computer-based multimedia to students in lecture classes. Includes the following topics: (1) Effects of illustrations on learning; (2) Using computer-based illustrations in lecture; (3) MacPresents-Multimedia Presentation Software; (4) Advantages of computer-based…

  8. Remote media vision-based computer input device

    Science.gov (United States)

    Arabnia, Hamid R.; Chen, Ching-Yi

    1991-11-01

    In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.

  9. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
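
    The adjoint idea can be made concrete with a hedged, minimal sketch on a toy decay simulation (not the optical tomography code discussed above): a forward sweep stores the state trajectory, a single backward sweep yields the gradient of the objective with respect to the parameter, and the result is checked against a finite difference.

```python
import numpy as np

def forward(k, u0=1.0, dt=0.01, n=200):
    """Explicit Euler simulation of du/dt = -k*u; returns the stored trajectory."""
    u = np.empty(n + 1)
    u[0] = u0
    for i in range(n):
        u[i + 1] = u[i] * (1.0 - k * dt)
    return u

def objective(u, target=0.2):
    return (u[-1] - target) ** 2

def adjoint_gradient(k, dt=0.01, n=200, target=0.2):
    """One backward (adjoint) sweep gives dJ/dk at the cost of one extra pass."""
    u = forward(k, dt=dt, n=n)
    lam = 2.0 * (u[-1] - target)      # dJ/du_N
    grad = 0.0
    for i in reversed(range(n)):
        grad += lam * (-dt * u[i])    # explicit dependence of u_{i+1} on k
        lam = lam * (1.0 - k * dt)    # propagate the adjoint back to u_i
    return grad

k = 5.0
g_adj = adjoint_gradient(k)
eps = 1e-6
g_fd = (objective(forward(k + eps)) - objective(forward(k - eps))) / (2 * eps)
print(f"adjoint gradient: {g_adj:.6e}, finite difference: {g_fd:.6e}")
```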

  10. GEOSURF: a computer program for modeling adsorption on mineral surfaces from aqueous solution

    Science.gov (United States)

    Sahai, Nita; Sverjensky, Dimitri A.

    1998-11-01

    A new program, GEOSURF, has been developed for calculating aqueous and surface speciation consistent with the triple-layer model of surface complexation. GEOSURF is an extension of the original programs MINEQL, MICROQL and HYDRAQL. We present, here, the basic algorithm of GEOSURF along with a description of the new features implemented. GEOSURF is linked to internally consistent data bases for surface species (SURFK.DAT) and for aqueous species (AQSOL.DAT). SURFK.DAT contains properties of minerals such as site densities, and equilibrium constants for adsorption of aqueous protons and electrolyte ions on a variety of oxides and hydroxides. The Helgeson, Kirkham and Flowers version of the extended Debye-Huckel Equation for 1:1 electrolytes is implemented for calculating aqueous activity coefficients. This permits the calculation of speciation at ionic strengths greater than 0.5 M. The activity of water is computed explicitly from the osmotic coefficient of the solution, and the total amount of electrolyte cation (or anion) is adjusted to satisfy the electroneutrality condition. Finally, the use of standard symbols for chemical species rather than species identification numbers is included to facilitate use of the program. One of the main limitations of GEOSURF is that aqueous and surface speciation can only be calculated at fixed pH and at fixed concentration of total adsorbate. Thus, the program cannot perform reaction-path calculations: it cannot determine whether or not a solution is over- or under-saturated with respect to one or more solid phases. To check the proper running of GEOSURF, we have compared results generated by GEOSURF with those from two other programs, HYDRAQL and EQ3. The Davies equation and the "bdot" equation, respectively, are used in the latter two programs for calculating aqueous activity coefficients. An example of the model fit to experimental data for rutile in 0.001 M-2.0 M NaNO 3 is included.
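
    The activity-coefficient treatment mentioned above can be illustrated with a generic extended Debye-Huckel sketch; the A and B constants below are the usual approximate values for water at 25 °C and the ion-size parameter is illustrative, so this is not the specific Helgeson-Kirkham-Flowers formulation implemented in GEOSURF.

```python
import math

def extended_debye_huckel_log10_gamma(z, ionic_strength, a_angstrom,
                                      A=0.509, B=0.328):
    """Extended Debye-Huckel: log10(gamma) = -A z^2 sqrt(I) / (1 + B a sqrt(I)).
    A and B are approximate values for water at 25 degC; a is the ion-size
    parameter in Angstrom (illustrative values only)."""
    sqrt_i = math.sqrt(ionic_strength)
    return -A * z**2 * sqrt_i / (1.0 + B * a_angstrom * sqrt_i)

# Example: a singly charged cation (z = +1, a ~ 4 Angstrom) in a 0.5 M 1:1 electrolyte.
log_gamma = extended_debye_huckel_log10_gamma(z=1, ionic_strength=0.5, a_angstrom=4.0)
print(f"log10(gamma) = {log_gamma:.3f}, gamma = {10**log_gamma:.3f}")
```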

  11. An Effective Approach of Teeth Segmentation within the 3D Cone Beam Computed Tomography Image Based on Deformable Surface Model

    Directory of Open Access Journals (Sweden)

    Xutang Zhang

    2016-01-01

    Full Text Available In order to extract the pixels of teeth from 3D Cone Beam Computed Tomography (CBCT) images, in this paper, a novel 3D segmentation approach based on a deformable surface model is developed for 3D tooth model reconstruction. Different forces are formulated to handle the segmentation problem by using different strategies. First, the proposed method estimates the deformation force of the vertex model by simulating the deformation process of a bubble under the action of internal pressure and an external force field. To handle blurry boundaries, a “braking force” is proposed, derived from the 3D gradient information calculated by extending the Sobel operator into a three-dimensional representation. In addition, a “border reinforcement” strategy is developed for handling cases with complicated structures. Moreover, the proposed method combines the affine cell image decomposition (ACID) grid reparameterization technique to handle the unstable changes of topological structure and deformability during the deformation process. The proposed method was performed on 510 CBCT images. To validate the performance, the results were compared with those of two other well-studied methods. Experimental results show that the proposed approach handles cases with complicated structures and blurry boundaries well, converges effectively, and can successfully achieve the reconstruction task of various types of teeth in the oral cavity.

  12. A quantum computer based on recombination processes in microelectronic devices

    International Nuclear Information System (INIS)

    Theodoropoulos, K; Ntalaperas, D; Petras, I; Konofaos, N

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is shown by the application of the proposed computer onto a well known physical system involving traps in semiconductor devices

  13. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    Full Text Available In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and then compute Minimum Spanning Trees (MST) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
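
    The MST computation itself is standard; a compact Kruskal sketch with union-find is shown below on a small hypothetical edge list (the image-to-graph step described in the abstract is not reproduced here).

```python
def kruskal_mst(n_vertices, edges):
    """Kruskal's algorithm: edges are (weight, u, v) tuples, vertices are 0..n-1."""
    parent = list(range(n_vertices))

    def find(x):                      # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0.0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # adding this edge creates no cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# Hypothetical points of interest (0..4) with pairwise walking distances.
edges = [(4, 0, 1), (2, 0, 2), (5, 1, 2), (7, 1, 3), (1, 2, 3), (3, 3, 4), (6, 2, 4)]
tree, length = kruskal_mst(5, edges)
print("MST edges:", tree, "total length:", length)
```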

  14. Standardized computer-based organized reporting of EEG:SCORE

    DEFF Research Database (Denmark)

    Beniczky, Sandor; H, Aurlien,; JC, Brøgger,

    2013-01-01

    Traditionally, EEG findings have been reported in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings). The terminology was developed through a consensus process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in clinical practice. SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make possible the build-up of a multinational database, and it will help in training young neurophysiologists.

  15. Three-dimensional measurement of small inner surface profiles using feature-based 3-D panoramic registration

    Science.gov (United States)

    Gong, Yuanzheng; Seibel, Eric J.

    2017-01-01

    Rapid development in the performance of sophisticated optical components, digital image sensors, and computing capabilities, along with decreasing costs, has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact operation, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most of the current optical approaches are suitable for exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, which is repeated after a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection.
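
    A common building block for such feature-based registration of point clouds is the least-squares rigid alignment of matched 3-D points; the SVD-based (Kabsch) sketch below assumes point correspondences are already known and is a generic illustration, not the paper's panoramic registration pipeline.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points,
    given one-to-one correspondences; both are (N, 3) arrays."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: rotate and shift a small point cloud, then recover the motion.
rng = np.random.default_rng(3)
src = rng.normal(size=(50, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([0.5, -1.0, 2.0])
R, t = rigid_align(src, dst)
print("max rotation error:", np.abs(R - R_true).max(), "translation:", t)
```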

  16. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Full Text Available Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction; therefore, practical experiences while using computer software tools for road accident reconstruction obtained in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper addresses also software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  17. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  18. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database.

  19. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphics Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of the 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are in two dimensions. The performance improvement of the GPU implementation over a serial CPU implementation is then discussed.
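
    As a hedged illustration of the kind of scheme discussed (a plain NumPy sketch, not the CUDA implementation from the paper): an explicit finite-difference update for 2-D heat diffusion in which every interior grid point is updated independently of the others, which is exactly the data-parallel structure that CUDA maps to one thread per grid point.

```python
import numpy as np

def heat_step(u, alpha, dx, dt):
    """One explicit finite-difference step of du/dt = alpha * (u_xx + u_yy).
    Each interior point depends only on its four neighbours, so all points
    can be updated in parallel (one GPU thread per grid point in CUDA)."""
    lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
           - 4.0 * u[1:-1, 1:-1]) / dx**2
    u_new = u.copy()
    u_new[1:-1, 1:-1] += alpha * dt * lap
    return u_new

# Hot square in the middle of a cold plate; fixed (Dirichlet) boundaries.
n, alpha, dx = 128, 1.0, 1.0
dt = 0.2 * dx**2 / alpha            # well inside the explicit stability limit
u = np.zeros((n, n))
u[48:80, 48:80] = 100.0
for _ in range(500):
    u = heat_step(u, alpha, dx, dt)
print("peak temperature after 500 steps:", u.max())
```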

  20. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.

  1. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Full Text Available Computer technology has provided language testing experts with opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment ( e.g. modes of test delivery, familiarity with computer, etc.,the question may be whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence, the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected from administering the writing section of a Cambridge Preliminary English Test (PET to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. Besides, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between high- and low-computer-familiar groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.

  2. Computed a multiple band metamaterial absorber and its application based on the figure of merit value

    Science.gov (United States)

    Chen, Chao; Sheng, Yuping; Jun, Wang

    2018-01-01

    A high-performance multiple-band metamaterial absorber is designed and computed with the software Ansoft HFSS 10.0; it is constituted of two kinds of separated metal-particle sub-structures. The multiple-band absorption property of the metamaterial absorber is based on the resonance of localized surface plasmon (LSP) modes excited near the edges of the metal particles. The damping constant of the gold layer is optimized to obtain a near-perfect absorption rate. Four kinds of dielectric layers are computed to achieve the perfect absorption performance. The perfect absorption performance of the metamaterial absorber is further enhanced by optimizing the structural parameters (R = 75 nm, w = 80 nm). Moreover, a perfect absorption band is achieved because of the plasmonic hybridization phenomenon between LSP modes. The designed metamaterial absorber is highly sensitive to changes in the refractive index of the liquid. A liquid refractive index sensing strategy is proposed based on the computed figure of merit (FOM) value of the metamaterial absorber. High FOM values (116, 111, and 108) are achieved with three liquids (methanol, carbon tetrachloride, and carbon disulfide).
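
    The refractive-index-sensing figure of merit can be illustrated with a small sketch; it assumes the common plasmonic-sensor definition FOM = (resonance shift per refractive index unit) / FWHM, and the numbers below are made-up examples rather than values from the paper.

```python
def figure_of_merit(lambda_1, lambda_2, n_1, n_2, fwhm):
    """FOM = sensitivity / linewidth, with sensitivity = d(lambda_res)/dn
    in nm per refractive index unit and FWHM in nm (a common sensor definition)."""
    sensitivity = (lambda_2 - lambda_1) / (n_2 - n_1)
    return sensitivity / fwhm

# Hypothetical example: the resonance shifts from 1500 nm to 1520 nm when the
# liquid refractive index changes from 1.33 to 1.35, with a 10 nm linewidth.
print("FOM =", figure_of_merit(1500.0, 1520.0, 1.33, 1.35, 10.0))
```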

  3. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, the structural reliability method must be applied in ocean engineering design such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be explicitly expressed, the response surface method is commonly used because it is conceptually clear and simple to program. However, the traditional response surface method fits a quadratic polynomial response surface whose accuracy is limited, because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be explicitly expressed in structural reliability analysis. In this method, a fuzzy neural network response surface for the whole area is constructed first, and then the structural reliability is calculated by a genetic algorithm. In the proposed method, all the sample points for training the network come from the whole area, so the true limit state surface in the whole area can be fitted. Calculation examples and comparative analysis show that the proposed method is much better than the traditional quadratic polynomial response surface method, because the amount of finite element analysis is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well in the whole area. Therefore, the method proposed in this paper is suitable for engineering application.

  4. A computer simulation of the surface channeling of MeV heavy charged particles

    International Nuclear Information System (INIS)

    Morita, K.

    1980-01-01

    The surface channeling of 1.5 MeV N + ions incident near the [011] direction on the (100) surface and near the [001] direction on the (110) surface of Ge crystals has been studied using computer simulation. The trajectories of ions incident at angles near the critical angle for axial channeling were traced. The energy spectra, the angular distributions and the reflection-depth distributions of scattered ions were obtained. The calculated energy spectra for both directions are found to be composed of a surface peak and a broad peak, the latter being at the low energy side of the surface peak. The height of the surface peak and the energy position of the broad peak are found to depend on the azimuthal component and the tilt component of the incident angle, respectively. This result is explained to be due to the focusing effect of channeled ions deflected by the atomic rows at the surface. It is shown that the calculated angular distributions of scattered ions form a half-ring pattern and clear dips appear in the scattering intensity curve along the half-ring. The dips are found to be caused by the blocking for scattered ions by the atomic rows arrayed in the major planar directions. (author)

  5. APC: A new code for Atmospheric Polarization Computations

    International Nuclear Information System (INIS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2013-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface. -- Highlights: •A new code, APC, has been developed. •The code was validated against well-known codes. •The BPDF for an arbitrary Mueller matrix is computed

  6. A method for sensible heat flux model parameterization based on radiometric surface temperature and environmental factors without involving the parameter KB-1

    Science.gov (United States)

    Zhuang, Qifeng; Wu, Bingfang; Yan, Nana; Zhu, Weiwei; Xing, Qiang

    2016-05-01

    Sensible heat flux is a key component of land-atmosphere interaction. In most parameterizations it is calculated from the surface-air temperature difference and the total aerodynamic resistance to heat transfer (Rae), which is related to the KB-1 parameter. Suitable values are hard to obtain since KB-1 is related both to canopy characteristics and to environmental conditions. In this paper, a parameterization method for sensible heat flux over vegetated surfaces (a maize field and grassland in the Heihe river basin of northwest China) was proposed based on the radiometric surface temperature, surface resistance (Rs) and vapor pressures (saturated and actual) at the surface and in the atmosphere above the canopy. A biophysics-based surface resistance model was revised to compute surface resistance from several environmental factors. The total aerodynamic resistance to heat transfer is directly calculated by combining the biophysics-based surface resistance and the vapor pressures. One merit of this method is that the calculation of KB-1 can be avoided. The method provides a new way to estimate sensible heat flux over vegetated surfaces and its performance compares well with the LAS-measured sensible heat flux and with other empirical or semi-empirical KB-1-based estimations.
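
    The bulk-transfer relation underlying this discussion can be sketched as H = rho * cp * (Ts - Ta) / Rae; the sketch below uses illustrative numbers and does not reproduce the paper's actual contribution, namely deriving Rae from the biophysics-based surface resistance and the vapor pressures so that KB-1 is not needed.

```python
def sensible_heat_flux(t_surface, t_air, r_ae, rho=1.2, cp=1005.0):
    """Bulk formula H = rho * cp * (Ts - Ta) / Rae  [W m-2].
    rho: air density (kg m-3), cp: specific heat of air (J kg-1 K-1),
    r_ae: total aerodynamic resistance to heat transfer (s m-1)."""
    return rho * cp * (t_surface - t_air) / r_ae

# Illustrative values: radiometric surface temperature 303 K, air temperature 298 K,
# and an assumed aerodynamic resistance of 60 s m-1.
print("H =", round(sensible_heat_flux(303.0, 298.0, 60.0), 1), "W m-2")
```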

  7. Development of processing procedure preparing for digital computer controlled equipment on modular design base

    International Nuclear Information System (INIS)

    Starosel'tsev, O.P.; Khrundin, V.I.

    1982-01-01

    In order to reduce the labour consumption of the technological preparation of production for digital computer controlled machines during the machining of steam turbine articles, a system of modular design of technological processes and control programs has been created. A set of typical module-transitions, each being a number of surfaces of an article treated with one cutting tool in an optimum sequence, together with a library of cutting tools, forms the base of the system. Introduction of such a system sharply enhances the efficiency of equipment utilization [ru

  8. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The Universal Soil Loss Equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. The L and S factors are usually combined into one LS factor – the topographic factor. The single factors are determined from several sources, such as a DTM (Digital Terrain Model), BPEJ – soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable precision degradation. Too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source’s precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. In such cases, we have proposed a new method using a two-step computation. The first-step computation uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but it performs the computation only over the area identified in the previous step. This decomposition allows a
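
    A grid-based USLE evaluation reduces to an element-wise product of the factor rasters; the sketch below uses small random arrays as stand-ins for real R, K, LS, C and P grids derived from a DTM and soil maps.

```python
import numpy as np

def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation on a grid: A = R * K * LS * C * P,
    evaluated cell by cell (all inputs are equally sized 2-D arrays)."""
    return R * K * LS * C * P

# Stand-in factor grids for a 100 x 100 raster (e.g., 30 m cells).
rng = np.random.default_rng(7)
shape = (100, 100)
R  = np.full(shape, 45.0)                 # rainfall erosivity (region-wide constant)
K  = rng.uniform(0.2, 0.5, shape)         # soil erodibility
LS = rng.uniform(0.1, 3.0, shape)         # topographic factor from the DTM
C  = rng.uniform(0.01, 0.3, shape)        # cropping management
P  = np.ones(shape)                       # no erosion-control measures

A = usle_soil_loss(R, K, LS, C, P)        # annual soil loss per cell
print("mean annual soil loss:", A.mean(), "max:", A.max())
```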

  9. Improvements of top-of-atmosphere and surface irradiance computations with CALIPSO-, CloudSat-, and MODIS-derived cloud and aerosol properties

    Science.gov (United States)

    Kato, Seiji; Rose, Fred G.; Sun-Mack, Sunny; Miller, Walter F.; Chen, Yan; Rutan, David A.; Stephens, Graeme L.; Loeb, Norman G.; Minnis, Patrick; Wielicki, Bruce A.; Winker, David M.; Charlock, Thomas P.; Stackhouse, Paul W., Jr.; Xu, Kuan-Man; Collins, William D.

    2011-10-01

    One year of instantaneous top-of-atmosphere (TOA) and surface shortwave and longwave irradiances are computed using cloud and aerosol properties derived from instruments on the A-Train Constellation: the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite, the CloudSat Cloud Profiling Radar (CPR), and the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS). When modeled irradiances are compared with those computed with cloud properties derived from MODIS radiances by a Clouds and the Earth's Radiant Energy System (CERES) cloud algorithm, the global and annual mean of modeled instantaneous TOA irradiances decreases by 12.5 W m-2 (5.0%) for reflected shortwave and 2.5 W m-2 (1.1%) for longwave irradiances. As a result, the global annual mean of instantaneous TOA irradiances agrees better with CERES-derived irradiances to within 0.5 W m-2 (out of 237.8 W m-2) for reflected shortwave and 2.6 W m-2 (out of 240.1 W m-2) for longwave irradiances. In addition, the global annual mean of instantaneous surface downward longwave irradiances increases by 3.6 W m-2 (1.0%) when CALIOP- and CPR-derived cloud properties are used. The global annual mean of instantaneous surface downward shortwave irradiances also increases by 8.6 W m-2 (1.6%), indicating that the net surface irradiance increases when CALIOP- and CPR-derived cloud properties are used. The increase in the surface downward longwave irradiance is caused by larger cloud fractions (the global annual mean by 0.11, or 0.04 excluding clouds with optical thickness less than 0.3) and lower cloud base heights (the global annual mean by 1.6 km). The increase of the surface downward longwave irradiance in the Arctic exceeds 10 W m-2 (~4%) in winter because CALIOP and CPR detect more clouds than the CERES cloud algorithm during polar night. The global annual mean surface downward longwave irradiance of

  10. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the extremely large scale, discrete and un-/semi-structured nature of big data has gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze and mine massive data, which effectively solves the problem that traditional data mining methods cannot adapt to massive data mining. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs an association-rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association-rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
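    The core of such a MapReduce formulation is a count-distribution step: each map task counts candidate itemsets on its own data split and the reduce step aggregates the partial counts. The sketch below simulates that pattern in plain Python; the split layout, function names and support threshold are illustrative assumptions rather than the paper's algorithm.

        from collections import Counter
        from itertools import combinations

        def map_phase(transactions, k):
            """Map: count candidate k-itemsets on one data split."""
            counts = Counter()
            for t in transactions:
                for itemset in combinations(sorted(t), k):
                    counts[itemset] += 1
            return counts

        def reduce_phase(partial_counts, min_support):
            """Reduce: sum the per-split counts and keep the frequent itemsets."""
            total = Counter()
            for c in partial_counts:
                total.update(c)
            return {s: n for s, n in total.items() if n >= min_support}

        # Two "splits" processed independently, as MapReduce workers would do.
        splits = [[{"a", "b", "c"}, {"a", "c"}], [{"a", "c", "d"}, {"b", "c"}]]
        frequent_pairs = reduce_phase([map_phase(s, 2) for s in splits], min_support=2)
        print(frequent_pairs)  # {('a', 'c'): 3, ('b', 'c'): 2}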

  11. Performance-based gear metrology: kinematic-transmission-error computation and diagnosis

    CERN Document Server

    Mark, William D

    2012-01-01

    A mathematically rigorous explanation of how manufacturing deviations and damage on the working surfaces of gear teeth cause transmission-error contributions to vibration excitations. Some gear-tooth working-surface manufacturing deviations of significant amplitude cause negligible vibration excitation and noise, yet others of minuscule amplitude are a source of significant vibration excitation and noise. Presently available computer-numerically-controlled dedicated gear metrology equipment can measure such error patterns on a gear in a few hours in sufficient detail to enable

  12. Curves and surfaces for computer-aided geometric design a practical guide

    CERN Document Server

    Farin, Gerald

    1992-01-01

    A leading expert in CAGD, Gerald Farin covers the representation, manipulation, and evaluation of geometric shapes in this, the third edition of Curves and Surfaces for Computer Aided Geometric Design. The book offers an introduction to the field that emphasizes Bernstein-Bezier methods and presents subjects in an informal, readable style, making this an ideal text for an introductory course at the advanced undergraduate or graduate level. The third edition includes a new chapter on Topology, offers new exercises and sections within most chapters, combines the material on Geometric Continuity i

  13. Impedance computations and beam-based measurements: A problem of discrepancy

    Science.gov (United States)

    Smaluk, Victor

    2018-04-01

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  14. CASTp 3.0: computed atlas of surface topography of proteins.

    Science.gov (United States)

    Tian, Wei; Chen, Chang; Lei, Xue; Zhao, Jieling; Liang, Jie

    2018-06-01

    Geometric and topological properties of protein structures, including surface pockets, interior cavities and cross channels, are of fundamental importance for proteins to carry out their functions. Computed Atlas of Surface Topography of proteins (CASTp) is a web server that provides online services for locating, delineating and measuring these geometric and topological properties of protein structures. It has been widely used since its inception in 2003. In this article, we present the latest version of the web server, CASTp 3.0. CASTp 3.0 continues to provide reliable and comprehensive identifications and quantifications of protein topography. In addition, it now provides: (i) imprints of the negative volumes of pockets, cavities and channels, (ii) topographic features of biological assemblies in the Protein Data Bank, (iii) improved visualization of protein structures and pockets, and (iv) more intuitive structural and annotated information, including information of secondary structure, functional sites, variant sites and other annotations of protein residues. The CASTp 3.0 web server is freely accessible at http://sts.bioe.uic.edu/castp/.

  15. Diffusion of Cd and Te adatoms on CdTe(111) surfaces: A computational study using density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Naderi, Ebadollah, E-mail: enaderi42@gmail.com [Department of Physics, Savitribai Phule Pune University (SPPU), Pune-411007 (India); Nanavati, Sachin [Center for Development of Advanced Computing (C-DAC), SPPU campus, Pune 411007 (India); Majumder, Chiranjib [Chemistry Division, Bhabha Atomic Research Center, Mumbai, 400085 (India); Ghaisas, S. V. [Department of Electronic Science, Savitribai Phule Pune University (SPPU), Pune-411007 (India); Department of Physics, Savitribai Phule Pune University (SPPU), Pune-411007 (India)

    2015-01-15

    CdTe is one of the most promising semiconductors for thin-film based solar cells. Here we report a computational study of Cd and Te adatom diffusion on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces and their migration paths. The atomic and electronic structure calculations are performed within the DFT formalism, and the climbing-image Nudged Elastic Band (cNEB) method has been applied to evaluate the potential barriers for Te and Cd diffusion. In general, the minimum-energy site on the surface is labeled the A{sub a} site. In the case of Te and Cd on the B-type surface, the sub-surface site (a site just below the top surface) is very close in energy to the A site. This is responsible for the subsurface accumulation of adatoms and is therefore expected to influence defect formation during growth. The diffusion of adatoms is considered from an occupied A{sub a} site to the nearest empty A{sub a} site. We have explored three possible migration paths for adatom diffusion. The adatom-surface interaction depends strongly on the type of surface. Typically, the Te interaction with both surface types (5.2 eV for A-type and 3.8 eV for B-type) is stronger than the Cd interactions (2.4 eV for B-type and 0.39 eV for A-type); the Cd interaction with the A-type surface is very weak. The distinct behavior of the A-type and B-type surfaces observed in our study explains the need to maintain the A-type surface during growth for smooth and stoichiometric growth.

  16. Diffusion of Cd and Te adatoms on CdTe(111) surfaces: A computational study using density functional theory

    Science.gov (United States)

    Naderi, Ebadollah; Nanavati, Sachin; Majumder, Chiranjib; Ghaisas, S. V.

    2015-01-01

    CdTe is one of the most promising semiconductors for thin-film based solar cells. Here we report a computational study of Cd and Te adatom diffusion on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces and their migration paths. The atomic and electronic structure calculations are performed within the DFT formalism, and the climbing-image Nudged Elastic Band (cNEB) method has been applied to evaluate the potential barriers for Te and Cd diffusion. In general, the minimum-energy site on the surface is labeled the Aa site. In the case of Te and Cd on the B-type surface, the sub-surface site (a site just below the top surface) is very close in energy to the A site. This is responsible for the subsurface accumulation of adatoms and is therefore expected to influence defect formation during growth. The diffusion of adatoms is considered from an occupied Aa site to the nearest empty Aa site. We have explored three possible migration paths for adatom diffusion. The adatom-surface interaction depends strongly on the type of surface. Typically, the Te interaction with both surface types (5.2 eV for A-type and 3.8 eV for B-type) is stronger than the Cd interactions (2.4 eV for B-type and 0.39 eV for A-type); the Cd interaction with the A-type surface is very weak. The distinct behavior of the A-type and B-type surfaces observed in our study explains the need to maintain the A-type surface during growth for smooth and stoichiometric growth.

  17. Diffusion of Cd and Te adatoms on CdTe(111) surfaces: A computational study using density functional theory

    International Nuclear Information System (INIS)

    Naderi, Ebadollah; Nanavati, Sachin; Majumder, Chiranjib; Ghaisas, S. V.

    2015-01-01

    CdTe is one of the most promising semiconductors for thin-film based solar cells. Here we report a computational study of Cd and Te adatom diffusion on the CdTe (111) A-type (Cd-terminated) and B-type (Te-terminated) surfaces and their migration paths. The atomic and electronic structure calculations are performed within the DFT formalism, and the climbing-image Nudged Elastic Band (cNEB) method has been applied to evaluate the potential barriers for Te and Cd diffusion. In general, the minimum-energy site on the surface is labeled the A_a site. In the case of Te and Cd on the B-type surface, the sub-surface site (a site just below the top surface) is very close in energy to the A site. This is responsible for the subsurface accumulation of adatoms and is therefore expected to influence defect formation during growth. The diffusion of adatoms is considered from an occupied A_a site to the nearest empty A_a site. We have explored three possible migration paths for adatom diffusion. The adatom-surface interaction depends strongly on the type of surface. Typically, the Te interaction with both surface types (5.2 eV for A-type and 3.8 eV for B-type) is stronger than the Cd interactions (2.4 eV for B-type and 0.39 eV for A-type); the Cd interaction with the A-type surface is very weak. The distinct behavior of the A-type and B-type surfaces observed in our study explains the need to maintain the A-type surface during growth for smooth and stoichiometric growth.

  18. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is based not only on technical factors such as the capabilities of the programming languages but also on the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  19. Computational Study of Environmental Effects on Torsional Free Energy Surface of N-Acetyl-N'-methyl-L-alanylamide Dipeptide

    Science.gov (United States)

    Carlotto, Silvia; Zerbetto, Mirco

    2014-01-01

    We propose an articulated computational experiment in which both quantum mechanics (QM) and molecular mechanics (MM) methods are employed to investigate environmental effects on the free energy surface for rotation of the backbone dihedral angles of the small dipeptide N-Acetyl-N'-methyl-L-alanylamide. This computational exercise is appropriate for an…

  20. Computer based training: Technology and trends

    International Nuclear Information System (INIS)

    O'Neal, A.F.

    1986-01-01

    Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT

  1. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking-mode prediction. These characteristics give computational FBDD an advantage in designing novel and potential compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.

  2. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 paper [19], which documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project..., documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  3. The mARM spatially distributed soil evolution model: A computationally efficient modeling framework and analysis of hillslope soil surface organization

    Science.gov (United States)

    Cohen, Sagy; Willgoose, Garry; Hancock, Greg

    2009-09-01

    Hillslope surface armouring and weathering processes have received little attention in geomorphologic and hydrologic models due to their complexity and the uncertainty associated with them. Their importance, however, in a wide range of spatial processes is well recognized. A physically based armouring and weathering computer model (ARMOUR) has previously been used to successfully simulate the effect of these processes on erosion and soil grading at a hillslope scale. This model is, however, computationally complex and cannot realistically be applied over large areas or over long periods of time. A simplified process conceptualization, named mARM, is presented; it models the physical processes with transition matrices and is orders of magnitude faster. We describe the modeling framework in detail, calibrate and evaluate the model against ARMOUR simulations, and show that it matches ARMOUR for a range of conditions. The computational efficiency of mARM allowed us to easily examine time- and space-varying relationships between erosion and physical weathering rates at the hillslope scale. For erosion-dominated slopes the surface coarsens over time, while for weathering-dominated slopes the surface fines over time. When erosion and weathering are comparable in scale, a slope can be weathering-dominated upslope (where runoff and therefore erosion is low) and armouring-dominated downslope. In all cases, for a constant-gradient slope the surface armour coarsens downslope as a result of a balance between erosion and weathering. Thus even for weathering-dominated slopes the surface grading catena depends on armouring through the balance between weathering and armouring. We also observed that for many slopes the surface initially armours but, after some period of time (space- and rate-dependent), weathering begins to dominate and the surface subsequently fines. Depending on the relative magnitude of armouring and weathering the final
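    The transition-matrix idea can be illustrated with a toy update of the surface grading (mass fraction in each grain-size class). The matrices below are invented examples, not mARM's calibrated process rates; they only show how selective erosion (armouring) and weathering act as linear operators on the grading vector.

        import numpy as np

        def step_grading(grading, armour_matrix, weather_matrix, dt):
            """One transition-matrix update of the surface grading vector; the
            matrix entries encode selective-erosion/armouring and weathering rates."""
            m = np.eye(len(grading)) + dt * (armour_matrix + weather_matrix)
            g = np.clip(m @ grading, 0.0, None)
            return g / g.sum()                      # renormalise to mass fractions

        # Toy example with 3 size classes (fine, medium, coarse): erosion removes
        # fines preferentially, weathering moves mass toward finer classes.
        armour = np.array([[-0.30,  0.00,  0.00],
                           [ 0.00, -0.10,  0.00],
                           [ 0.00,  0.00, -0.02]])
        weather = np.array([[ 0.00,  0.05,  0.00],
                            [ 0.00, -0.05,  0.08],
                            [ 0.00,  0.00, -0.08]])
        g = np.array([0.3, 0.4, 0.3])
        for _ in range(100):
            g = step_grading(g, armour, weather, dt=0.1)
        print(g)   # with these rates erosion wins and the surface coarsens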

  4. Rh-Based Mixed Alcohol Synthesis Catalysts: Characterization and Computational Report

    Energy Technology Data Exchange (ETDEWEB)

    Albrecht, Karl O.; Glezakou, Vassiliki Alexandra; Rousseau, Roger J.; Engelhard, Mark H.; Varga, Tamas; Colby, Robert J.; Jaffe, John E.; Li, Xiaohong S.; Mei, Donghai; Windisch, Charles F.; Kathmann, Shawn M.; Lemmon, Teresa L.; Gray, Michel J.; Hart, Todd R.; Thompson, Becky L.; Gerber, Mark A.

    2013-08-01

    The U.S. Department of Energy is conducting a program focused on developing a process for the conversion of biomass to bio-based fuels and co-products. Biomass-derived syngas is converted thermochemically within a temperature range of 240 to 330°C and at elevated pressure (e.g., 1200 psig) over a catalyst. Ethanol is the desired reaction product, although other side compounds are produced, including C3 to C5 alcohols; higher (i.e., greater than C1) oxygenates such as methyl acetate, ethyl acetate, acetic acid and acetaldehyde; and higher hydrocarbon gases such as methane, ethane/ethene, propane/propene, etc. Saturated hydrocarbon gases (especially methane) are undesirable because they represent a diminished yield of carbon to the desired ethanol product and represent compounds that must be steam reformed at high energy cost to reproduce CO and H2. Ethanol produced by the thermochemical reaction of syngas could be separated and blended directly with gasoline to produce a liquid transportation fuel. Additionally, higher oxygenates and unsaturated hydrocarbon side products such as olefins also could be further processed to liquid fuels. The goal of the current project is the development of a Rh-based catalyst with high activity and selectivity to C2+ oxygenates. This report chronicles an effort to characterize numerous supports and catalysts to identify particular traits that could be correlated with the most active and/or selective catalysts. Carbon and silica supports and catalysts were analyzed. Generally, analyses provided guidance in the selection of acceptable catalyst supports. For example, supports with high surface areas due to a high number of micropores were generally found to be poor at producing oxygenates, possibly because of mass transfer limitations of the products formed out of the micropores. To probe fundamental aspects of the complicated reaction network of CO with H2, a computational/ theoretical investigation using quantum mechanical and ab

  5. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  6. Biomimetic PDMS-hydroxyurethane terminated with catecholic moieties for chemical grafting on transition metal oxide-based surfaces

    Science.gov (United States)

    de Aguiar, Kelen R.; Rischka, Klaus; Gätjen, Linda; Noeske, Paul-Ludwig Michael; Cavalcanti, Welchy Leite; Rodrigues-Filho, Ubirajara P.

    2018-01-01

    The aim of this work was to synthesize a non-isocyanate poly(dimethylsiloxane) hydroxyurethane with biomimetic terminal catechol moieties as a candidate for inorganic and metallic surface modification. Such a surface modifier can attach strongly onto metallic and inorganic substrates, forming layers and, in addition, providing water-repellent surfaces. The non-isocyanate route is based on carbon dioxide cycloaddition into a bis-epoxide, resulting in the precursor bis(cyclic carbonate)-polydimethylsiloxane (CCPDMS) and thus fully replacing isocyanate in the manufacturing process. A biomimetic approach was chosen, with the molecular composition inspired by terminal peptides present in the adhesive proteins of mussels, such as Mefp (Mytilus edulis foot protein), which bear catechol moieties and are strong adhesives even in natural and saline water. The catechol terminal groups were grafted onto the polydimethylsiloxane backbone by an aminolysis reaction. The product, PDMSUr-Dopamine, presented high affinity towards inhomogeneous alloy surfaces terminated by native oxide layers, as demonstrated by quartz crystal microbalance (QCM-D) measurements, as well as stability against desorption by rinsing with ethanol. As revealed by QCM-D, X-ray photoelectron spectroscopy (XPS) and computational studies, the thickness and composition of the resulting nanolayers indicate attachment of the PDMSUr-Dopamine molecules to the substrate through both terminal catechol groups, with the adsorbate exposing the hydrophobic PDMS backbone. This hypothesis was investigated by classical molecular dynamics (MD) simulation of pure PDMSUr-Dopamine molecules on SiO2 surfaces. The computationally obtained PDMSUr-Dopamine assembly is in agreement with the conclusions drawn from the experiments regarding the conformation of PDMSUr-Dopamine towards the surface. The tendency of the terminal catechol groups to approach the surface is in agreement with the proposed model for the attachment of PDMSUr-Dopamine. Remarkably, the versatile

  7. Enhancing school-based asthma education efforts using computer-based education for children.

    Science.gov (United States)

    Nabors, Laura A; Kockritz, Jennifer L; Ludke, Robert L; Bernstein, Jonathan A

    2012-03-01

    Schools are an important site for delivery of asthma education programs. Computer-based educational programs are a critical component of asthma education programs and may be a particularly important education method in busy school environments. The objective of this brief report is to review and critique computer-based education efforts in schools. The results of our literature review indicated that school-based computer education efforts are related to improved knowledge about asthma and its management. In some studies, improvements in clinical outcomes also occur. Data collection programs need to be built into games that improve knowledge. Many projects do not appear to last for periods greater than 1 year and little information is available about cultural relevance of these programs. Educational games and other programs are effective methods of delivering knowledge about asthma management and control. Research about the long-term effects of this increased knowledge, in regard to behavior change, is needed. Additionally, developing sustainable projects, which are culturally relevant, is a goal for future research.

  8. Computation of the temperatures of a fluid flowing through a pipe from temperature measurements on the pipe's outer surface

    International Nuclear Information System (INIS)

    Sauer, G.

    1999-01-01

    A method for computing the temperatures of a fluid flowing through a pipe on the basis of temperatures recorded at the pipe's outer surface is presented. The heat conduction in the pipe wall is described by one-dimensional heat conduction elements. Heat transfer between the fluid, the pipe and the surroundings is allowed for. The equation system resulting from the standard finite element discretization is reformulated to enable the computation of temperature events that precede the recorded temperatures in time. It is shown that the method can be used to identify the actual fluid temperature from temperature data obtained only at the outer surface of the pipe. The temperatures in the pipe wall are computed with good accuracy even in the case of a severe thermal shock. (orig.) [de]
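    A minimal sketch of the underlying inverse-conduction idea is given below. It is not the paper's finite-element formulation: it uses direct space marching on a 1-D slab, assumes an adiabatic outer surface for simplicity, and ignores the fluid-wall and wall-surroundings heat-transfer coefficients; real outer-surface data would also need low-pass filtering, because this direct scheme amplifies measurement noise.

        import numpy as np

        def inner_wall_temperature(t_outer, alpha, dx, dt, n_nodes):
            """Estimate the inner (fluid-side) wall temperature history from the
            measured outer-surface history by marching the 1-D heat conduction
            equation inward through the wall (node 0 = outer surface)."""
            T = np.zeros((n_nodes, len(t_outer)))
            T[0] = np.asarray(t_outer, dtype=float)
            # Adiabatic outer boundary: T1 = T0 + dx^2/(2*alpha) * dT0/dt
            T[1] = T[0] + dx**2 / (2.0 * alpha) * np.gradient(T[0], dt)
            # Interior nodes: T[i+1] = 2*T[i] - T[i-1] + dx^2/alpha * dT[i]/dt
            for i in range(1, n_nodes - 1):
                T[i + 1] = 2.0 * T[i] - T[i - 1] + dx**2 / alpha * np.gradient(T[i], dt)
            return T[-1]   # approximates the fluid temperature when the inner film resistance is small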

  9. Computer program for post-flight evaluation of the control surface response for an attitude controlled missile

    Science.gov (United States)

    Knauber, R. N.

    1982-01-01

    A FORTRAN IV coded computer program is presented for post-flight analysis of a missile's control surface response. It includes preprocessing of digitized telemetry data for time lags, biases, non-linear calibration changes and filtering. Measurements include autopilot attitude rate and displacement gyro output and four control surface deflections. Simple first order lags are assumed for the pitch, yaw and roll axes of control. Each actuator is also assumed to be represented by a first order lag. Mixing of pitch, yaw and roll commands to four control surfaces is assumed. A pseudo-inverse technique is used to obtain the pitch, yaw and roll components from the four measured deflections. This program has been used for over 10 years on the NASA/SCOUT launch vehicle for post-flight analysis and was helpful in detecting incipient actuator stall due to excessive hinge moments. The program is currently set up for a CDC CYBER 175 computer system. It requires 34K words of memory and contains 675 cards. A sample problem presented herein including the optional plotting requires eleven (11) seconds of central processor time.
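    The pseudo-inverse step mentioned above is a simple least-squares fit. The sketch below uses a hypothetical 4x3 mixing matrix (the actual SCOUT mixing gains are not given in the abstract) to recover pitch, yaw and roll components from four measured deflections.

        import numpy as np

        # Hypothetical mixing: deflection = M @ [pitch, yaw, roll] command vector.
        M = np.array([[ 1.0,  0.0,  1.0],
                      [-1.0,  0.0,  1.0],
                      [ 0.0,  1.0,  1.0],
                      [ 0.0, -1.0,  1.0]])

        def axis_components(deflections):
            """Recover the pitch, yaw and roll components from the four measured
            control-surface deflections via the Moore-Penrose pseudo-inverse
            (the least-squares solution of the over-determined mixing relation)."""
            return np.linalg.pinv(M) @ np.asarray(deflections, dtype=float)

        print(axis_components([2.1, -1.9, 1.0, -0.8]))   # -> [2.0, 0.9, 0.1]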

  10. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available Abstract: The selection of new student candidates can be carried out with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, a design model, implementation and testing. This research produced a CBT application in which the questions drawn from the question bank go through a randomization process, using the Fisher-Yates Shuffle method, so that the same question is not presented twice. To secure the question data while it is transmitted over the network, a message-encoding technique is required so that each question first passes through data encryption and decryption before being displayed; the RSA cryptographic algorithm is used for this purpose. The software was designed with the waterfall model, the database with an entity relationship diagram, and the interface with hypertext markup language (HTML), Cascading Style Sheets (CSS) and jQuery; the system was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network. Abstract: Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, a design model, implementation and testing. This study produces a CBT application where the questions raised from the question bank through a randomization process will not bring up the same question twice, using the Fisher-Yates Shuffle method. In the process of securing the question information when connected to the network, message-encoding techniques are necessary so that each question passes through encryption and decryption of the data before it appears; the RSA cryptography algorithm is used for this. Software design method using waterfall model, database design
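    The question-randomization step is a textbook Fisher-Yates shuffle. The application itself is written in PHP; the sketch below is a language-neutral illustration in Python, with a hypothetical bank of question ids.

        import random

        def fisher_yates_shuffle(items, rng=None):
            """In-place Fisher-Yates shuffle: each position i is swapped with a
            uniformly chosen position j <= i, producing an unbiased permutation."""
            rng = rng or random.Random()
            for i in range(len(items) - 1, 0, -1):
                j = rng.randint(0, i)
                items[i], items[j] = items[j], items[i]
            return items

        question_ids = list(range(1, 101))               # hypothetical question bank
        exam = fisher_yates_shuffle(question_ids)[:20]   # draw 20 questions, no repeats
        print(exam)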

  11. CAMAC based computer--computer communications via microprocessor data links

    International Nuclear Information System (INIS)

    Potter, J.M.; Machen, D.R.; Naivar, F.J.; Elkins, E.P.; Simmonds, D.D.

    1976-01-01

    Communications between the central control computer and remote satellite data acquisition/control stations at the Clinton P. Anderson Meson Physics Facility (LAMPF) are presently accomplished through the use of CAMAC-based Data Link Modules. With the advent of the microprocessor, a new philosophy for digital data communications has evolved. Data Link modules containing microprocessor controllers provide link management and communication network protocol through algorithms executed in the Data Link microprocessor

  12. Computation of antenna pattern correlation and MIMO performance by means of surface current distribution and spherical wave theory

    Directory of Open Access Journals (Sweden)

    O. Klemp

    2006-01-01

    Full Text Available In order to satisfy the stringent demand for accurate prediction of MIMO channel capacity and diversity performance in wireless communications, more effective and suitable models that account for real antenna radiation behavior have to be taken into account. One of the main challenges is the accurate modeling of antenna correlation, which is directly related to the amount of channel capacity or diversity gain that can be achieved in multi-element antenna configurations. Spherical wave theory in electromagnetics is a well-known technique for expressing antenna far fields by means of a compact field expansion with a reduced number of unknowns; it was recently applied to derive an analytical approach to the computation of antenna pattern correlation. In this paper we present a novel and efficient computational technique to determine antenna pattern correlation based on the evaluation of the surface current distribution by means of a spherical mode expansion.
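    For reference, the quantity being computed can also be evaluated by brute-force numerical integration of the far fields over the sphere, as sketched below; the paper's contribution is to obtain the same correlation far more efficiently from the surface currents via a spherical-mode expansion. Array shapes and sampling are assumptions of this sketch.

        import numpy as np

        def envelope_correlation(e_th_1, e_ph_1, e_th_2, e_ph_2, theta, dtheta, dphi):
            """Antenna pattern (envelope) correlation of two elements from their
            complex far-field components sampled on a regular (theta, phi) grid;
            `theta` holds the polar angle of every sample for the sin(theta) weight."""
            w = np.sin(theta) * dtheta * dphi                    # solid-angle weights
            cross = np.sum((e_th_1 * np.conj(e_th_2) + e_ph_1 * np.conj(e_ph_2)) * w)
            p1 = np.sum((np.abs(e_th_1)**2 + np.abs(e_ph_1)**2) * w)
            p2 = np.sum((np.abs(e_th_2)**2 + np.abs(e_ph_2)**2) * w)
            return np.abs(cross)**2 / (p1 * p2)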

  13. A wavelet-based PWTD algorithm-accelerated time domain surface integral equation solver

    KAUST Repository

    Liu, Yang

    2015-10-26

    © 2015 IEEE. The multilevel plane-wave time-domain (PWTD) algorithm allows for fast and accurate analysis of transient scattering from, and radiation by, electrically large and complex structures. When used in tandem with marching-on-in-time (MOT)-based surface integral equation (SIE) solvers, it reduces the computational and memory costs of transient analysis from O(Nt Ns^2) and O(Ns^2) to O(Nt Ns log^2 Ns) and O(Ns^1.5), respectively, where Nt and Ns denote the number of temporal and spatial unknowns (Ergin et al., IEEE Trans. Antennas Mag., 41, 39-52, 1999). In the past, PWTD-accelerated MOT-SIE solvers have been applied to transient problems involving half a million spatial unknowns (Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003). Recently, a scalable parallel PWTD-accelerated MOT-SIE solver that leverages a hierarchical parallelization strategy has been developed and successfully applied to transient problems involving ten million spatial unknowns (Liu et al., in URSI Digest, 2013). We further enhanced the capabilities of this solver by implementing a compression scheme based on local cosine wavelet bases (LCBs) that exploits the sparsity in the temporal dimension (Liu et al., in URSI Digest, 2014). Specifically, the LCB compression scheme was used to reduce the memory requirement of the PWTD ray data and the computational cost of operations in the PWTD translation stage.

  14. The use of computational thermodynamics for the determination of surface tension and Gibbs-Thomson coefficient of multicomponent alloys

    Science.gov (United States)

    Ferreira, D. J. S.; Bezerra, B. N.; Collyer, M. N.; Garcia, A.; Ferreira, I. L.

    2018-04-01

    The simulation of casting processes demands accurate information on the thermophysical properties of the alloy; however, such information is scarce in the literature for multicomponent alloys. Generally, metallic alloys applied in industry have more than three solute components. In the present study, a general solution of Butler's formulation for surface tension is presented for multicomponent alloys and is applied to quaternary Al-Cu-Si-Fe alloys, thus permitting the Gibbs-Thomson coefficient to be determined. This coefficient is a determining factor for the reliability of predictions furnished by microstructure growth models and by numerical computations of solidification thermal parameters, which depend on the thermophysical properties assumed in the calculations. The Gibbs-Thomson coefficient for ternary and quaternary alloys is seldom reported in the literature. A numerical model based on Powell's hybrid algorithm and a finite-difference Jacobian approximation has been coupled to a Thermo-Calc TCAPI interface to assess the excess Gibbs energy of the liquid phase, permitting the liquidus temperature, latent heat, alloy density, surface tension and Gibbs-Thomson coefficient of Al-Cu-Si-Fe hypoeutectic alloys to be calculated, as an example of the calculation capabilities of the proposed method for multicomponent alloys. The computed results are compared with thermophysical properties of binary Al-Cu and ternary Al-Cu-Si alloys found in the literature and are presented as a function of the Cu solute composition.
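    Butler's formulation reduces, for a binary A-B solution in the ideal-solution limit (all excess Gibbs energy terms dropped, which the paper instead evaluates through Thermo-Calc), to two expressions for the surface tension that must agree at the equilibrium surface composition. The sketch below finds that composition by bisection; the pure-component surface tensions and molar surface areas are user-supplied inputs, not values from the paper.

        import math

        R_GAS = 8.314  # J mol-1 K-1

        def butler_binary_ideal(sigma_pure, molar_area, x_a_bulk, temperature):
            """Solve sigma = sigma_A + RT/A_A * ln(xA_s/xA_b)
                           = sigma_B + RT/A_B * ln((1-xA_s)/(1-xA_b))
            for the surface fraction xA_s; return (surface tension, xA_s)."""
            sig_a, sig_b = sigma_pure
            area_a, area_b = molar_area
            rt = R_GAS * temperature

            def mismatch(xa_s):   # monotonically increasing in xa_s
                return (sig_a + rt / area_a * math.log(xa_s / x_a_bulk)
                        - sig_b - rt / area_b * math.log((1 - xa_s) / (1 - x_a_bulk)))

            lo, hi = 1e-9, 1.0 - 1e-9
            for _ in range(200):                       # bisection on the surface fraction
                mid = 0.5 * (lo + hi)
                if mismatch(mid) > 0.0:
                    hi = mid
                else:
                    lo = mid
            xa_s = 0.5 * (lo + hi)
            return sig_a + rt / area_a * math.log(xa_s / x_a_bulk), xa_s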

  15. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems, has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  16. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research on algorithmic and computer-based approaches to medical imaging applications and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer Aided Diagnosis Techniques.

  17. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to the many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy and oscillating iteration process and makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index reduces to an acceptable accuracy, the solution is considered convergent and the nonprobabilistic reliability index is obtained. The case study indicates that, compared with the traditional nonprobabilistic response surface method, the proposed method avoids an oscillating iteration process, makes the iteration stable and convergent, reduces the number of iteration steps significantly, and significantly improves computational efficiency and precision. Finally, a nonprobabilistic reliability evaluation process for bridges is built by evaluating the reliability of a three-span PC continuous rigid-frame bridge with the proposed method, which appears to be simpler and more reliable when samples and parameters for the bridge nonprobabilistic reliability evaluation are lacking.

  18. Computer simulation study of the displacement threshold-energy surface in Cu

    International Nuclear Information System (INIS)

    King, W.E.; Benedek, R.

    1981-01-01

    Computer simulations were performed using the molecular-dynamics technique to determine the directional dependence of the threshold energy for the production of stable Frenkel pairs in copper. Sharp peaks were observed in the simulated threshold-energy surface in between the low-index directions. Threshold energies ranged from approximately 25 eV for directions near the low-index directions up to 180 eV at the position of the peak between them. The general topographical features of the simulated threshold-energy surface are in good agreement with those determined from an analysis of recent experiments by King et al. on the basis of a Frenkel-pair resistivity rho_F = 2.85 x 10^-4 Ω cm. Evidence is presented in favor of this number as opposed to the usually assumed value, rho_F = 2.00 x 10^-4 Ω cm. The energy dependence of defect production in a number of directions was investigated to determine the importance of nonproductive events above threshold

  19. Spin-based quantum computation in multielectron quantum dots

    OpenAIRE

    Hu, Xuedong; Sarma, S. Das

    2001-01-01

    In a quantum computer the hardware and software are intrinsically connected, because the quantum Hamiltonian (or more precisely its time development) is the code that runs the computer. We demonstrate this subtle and crucial relationship by considering the example of an electron-spin-based solid-state quantum computer in semiconductor quantum dots. We show that multielectron quantum dots with one valence electron in the outermost shell do not behave simply as an effective single-spin system unles...

  20. Optimization of the Upper Surface of Hypersonic Vehicle Based on CFD Analysis

    Science.gov (United States)

    Gao, T. Y.; Cui, K.; Hu, S. C.; Wang, X. P.; Yang, G. W.

    2011-09-01

    For hypersonic vehicles, the demands on aerodynamic performance become more intense. It is therefore important to optimize the shape of a hypersonic vehicle to meet the project requirements, and shape optimization is a key technology for improving vehicle performance. Starting from an existing vehicle, the upper surface of a simplified hypersonic vehicle was optimized to obtain a shape that suits the project requirements. At the cruise condition, the upper surface was parameterized with the B-spline curve method. An incremental parametric method and local mesh reconstruction were applied. The whole flow field was calculated and the aerodynamic performance of the craft was obtained by computational fluid dynamics (CFD). The vehicle shape was then optimized to achieve the maximum lift-to-drag ratio at angles of attack of 3°, 4° and 5°. The results provide a reference for practical design.

  1. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing has been proposed as a promising computing paradigm to relieve the excessive burden placed on data centers and mobile networks by the rapid growth of the Internet of Things (IoT). This work introduces a cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Because of the constrained computation resources of cloudlets and the limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, a theoretical analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem is formulated as a computation offloading game and the condition for a Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm scales well with an increasing number of IoT sensors.
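    The finite improvement property mentioned above leads directly to a simple decentralized procedure: sensors take turns switching to their currently cheapest strategy until nobody can improve, which in a potential game is a Nash equilibrium. The congestion-style cost below is a hypothetical stand-in for the paper's delay-plus-energy model.

        import random

        def finite_improvement(n_sensors, cloudlets, local_cost=10.0, seed=0):
            """Best-response dynamics for an offloading game: each sensor picks
            local execution (None) or one cloudlet; offloading cost grows with
            the number of sensors sharing that cloudlet/AP."""
            rng = random.Random(seed)
            base = {c: rng.uniform(1.0, 3.0) for c in cloudlets}  # per-cloudlet unit cost
            choice = {s: None for s in range(n_sensors)}          # None = compute locally

            def cost(option, others_on_it):
                return local_cost if option is None else base[option] * (others_on_it + 1)

            improved = True
            while improved:                                       # finite improvement path
                improved = False
                for s in range(n_sensors):
                    loads = {c: sum(1 for t, v in choice.items() if v == c and t != s)
                             for c in cloudlets}
                    load_of = lambda o: 0 if o is None else loads[o]
                    best = min([None] + list(cloudlets), key=lambda o: cost(o, load_of(o)))
                    if cost(best, load_of(best)) < cost(choice[s], load_of(choice[s])):
                        choice[s] = best
                        improved = True
            return choice

        print(finite_improvement(8, ["cloudlet-1", "cloudlet-2"]))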

  2. Films made of cellulose nanofibrils: surface modification by adsorption of a cationic surfactant and characterization by computer-assisted electron microscopy

    International Nuclear Information System (INIS)

    Syverud, K.; Xhanari, K.; Chinga-Carrasco, G.; Yu, Y.; Stenius, P.

    2011-01-01

    Films made of nanofibrils were modified by adsorption of a cationic surfactant directly on the film surfaces. The nanofibrils were prepared by 2,2,6,6-tetramethylpiperidinyl-1-oxyl (TEMPO)-mediated oxidation and mechanical fibrillation, and were relatively homogeneous in size. The average nanofibril diameter and the surface porosity were quantified based on computer-assisted field-emission scanning electron microscopy (FE-SEM). The cationic surfactant used in the adsorption was n-hexadecyl trimethylammonium bromide (cetyltrimethylammonium bromide, CTAB). The adsorption of CTAB was confirmed by Fourier transform infrared (FTIR) spectroscopy and high-resolution transmission electron microscopy (HRTEM) analyses. It was shown that the adsorbed layer of CTAB increased the hydrophobicity without affecting the tensile index significantly. This capability, combined with the antiseptic properties of CTAB, may be a major advantage for several applications.

  3. Sensitivity analysis of brain morphometry based on MRI-derived surface models

    Science.gov (United States)

    Klein, Gregory J.; Teng, Xia; Schoenemann, P. T.; Budinger, Thomas F.

    1998-07-01

    Quantification of brain structure is important for evaluating changes in brain size with growth and aging and for characterizing neurodegenerative disorders. Previous quantification efforts using ex vivo techniques suffered considerable error due to shrinkage of the cerebrum after extraction from the skull, deformation of slices during sectioning, and numerous other factors. In vivo imaging studies of brain anatomy avoid these problems and allow repeated studies that follow the progression of brain structure changes due to disease or natural processes. We have developed a methodology for obtaining triangular mesh models of the cortical surface from MRI brain datasets. The cortex is segmented from non-brain tissue using a 2D region-growing technique combined with occasional manual edits. Once segmented, thresholding and image morphological operations (erosions and openings) are used to expose the regions between adjacent surfaces in deep cortical folds. A 2D region-following procedure is then used to find a set of contours outlining the cortical boundary on each slice. The contours on all slices are tiled together to form a closed triangular mesh model approximating the cortical surface. This model can be used for calculation of cortical surface area and volume, as well as other parameters of interest. Except for the initial segmentation of the cortex from the skull, the technique is automatic and requires only modest computation time on modern workstations. Though the use of image data avoids many of the pitfalls of ex vivo and sectioning techniques, our MRI-based technique is still vulnerable to errors that may impact the accuracy of estimated brain structure parameters. Potential inaccuracies include segmentation errors due to incorrect thresholding, missed deep sulcal surfaces, falsely segmented holes due to image noise and surface tiling artifacts. The focus of this paper is the characterization of these errors and how they affect measurements of cortical surface
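    Once the closed triangular mesh is available, the cortical surface area and the enclosed volume mentioned above follow from standard mesh formulas (triangle cross products and the divergence theorem). The sketch below assumes consistently outward-oriented triangles; it is not the authors' code.

        import numpy as np

        def mesh_area_and_volume(vertices, triangles):
            """Surface area and enclosed volume of a closed triangular mesh.
            vertices: (n, 3) float array; triangles: (m, 3) integer index array."""
            v = np.asarray(vertices, dtype=float)
            t = np.asarray(triangles, dtype=int)
            a, b, c = v[t[:, 0]], v[t[:, 1]], v[t[:, 2]]
            area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
            # Signed tetrahedron volumes against the origin, summed (divergence theorem)
            volume = abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0
            return area, volume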

  4. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    OpenAIRE

    Mohammad Mohammadi; Masoud Barzgaran

    2010-01-01

    Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based te...

  5. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  6. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  7. THE INTEGRATED USE OF COMPUTATIONAL CHEMISTRY, SCANNING PROBE MICROSCOPY, AND VIRTUAL REALITY TO PREDICT THE CHEMICAL REACTIVITY OF ENVIRONMENTAL SURFACES

    Science.gov (United States)

    In the last decade three new techniques - scanning probe microscopy (SPM), virtual reality (VR) and computational chemistry - have emerged with the combined capability of a priori predicting the chemical reactivity of environmental surfaces. Computational chemistry provides the cap...

  8. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    Full Text Available There are performance bottlenecks and scalability problems when a traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data-mining system, this platform is highly scalable, has massive data processing capacity, is service-oriented, and has low hardware cost. This platform can support the design and application of a wide range of distributed data-mining systems.

  9. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot yet be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior for learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about the engagement and behavior aspects and their contribution to learning

  10. Evaluating Computer-Based Assessment in a Risk-Based Model

    Science.gov (United States)

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision making process, evaluation for understanding to further enhance enlightenment and evaluation for control to ensure compliance to standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA)…

  11. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning as a learning system has become very popular. However, current e-learning systems cannot instruct students effectively, since they do not consider the emotional state in the context of instruction. The emergence of the theory of "affective computing" can address this problem: it means that the computer's intelligence need no longer be purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on "affective computing". A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher avatar is offered to regulate the student's learning psychology, with the teaching style chosen in consideration of the student's personality traits. A "man-to-man" learning environment is built in the system to simulate traditional classroom pedagogy.

  12. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    Energy Technology Data Exchange (ETDEWEB)

    Ma Xiuzeng [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)]. E-mail: hongju@purdue.edu; Li Yingkui [Department of Geography, University of Missouri-Columbia, Columbia, MO 65211 (United States); Bourgeois, Mike [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Elmore, David [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Granger, Darryl [Department of Earth and Atmospheric Sciences, Purdue University, West Lafayette, IN 47907 (United States); Muzikar, Paul [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Smith, Preston [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)

    2007-06-15

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by accelerator mass spectrometry. WebCN for {sup 10}Be and {sup 26}Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for {sup 36}Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
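    The core exposure-age computation behind such a tool can be illustrated with the textbook closed-form solution for a steadily eroding surface; WebCN's actual treatment of the spatial and temporal variation of the cosmic-ray flux is more involved, and the numbers below (production rate, 10Be decay constant, attenuation length) are only illustrative.

        import math

        def exposure_age(concentration, production_rate, decay_const,
                         erosion_rate=0.0, density=2.7, attenuation=160.0):
            """Exposure age [yr] from N = P/k * (1 - exp(-k*t)) with
            k = lambda + rho*eps/Lambda (concentration in atoms/g, production
            rate in atoms/g/yr, erosion rate in cm/yr, density in g/cm^3,
            attenuation length in g/cm^2)."""
            k = decay_const + density * erosion_rate / attenuation
            x = 1.0 - concentration * k / production_rate
            if x <= 0.0:
                raise ValueError("concentration at or above secular equilibrium")
            return -math.log(x) / k

        # ~10 kyr for a hypothetical site: 5e4 atoms/g of 10Be at 5 atoms/g/yr
        print(exposure_age(5.0e4, production_rate=5.0, decay_const=4.99e-7))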

  13. Binding Direction-Based Two-Dimensional Flattened Contact Area Computing Algorithm for Protein-Protein Interactions.

    Science.gov (United States)

    Kang, Beom Sik; Pugalendhi, GaneshKumar; Kim, Ku-Jin

    2017-10-13

    Interactions between protein molecules are essential for the assembly, function, and regulation of proteins. The contact region between two protein molecules in a protein complex is usually complementary in shape for both molecules and the area of the contact region can be used to estimate the binding strength between two molecules. Although the area is a value calculated from the three-dimensional surface, it cannot represent the three-dimensional shape of the surface. Therefore, we propose an original concept of two-dimensional contact area which provides further information such as the ruggedness of the contact region. We present a novel algorithm for calculating the binding direction between two molecules in a protein complex, and then suggest a method to compute the two-dimensional flattened area of the contact region between two molecules based on the binding direction.
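
    The authors' binding-direction estimation is not detailed in the abstract. As a rough sketch of the flattening idea it motivates, one can project the 3D contact-region points onto the plane perpendicular to a given binding direction and measure the area of their 2D convex hull (a simplification, since a real contact patch need not be convex). The function below assumes the binding direction has already been obtained.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    def flattened_contact_area(points, binding_dir):
        """Project 3D contact-region points onto the plane perpendicular to
        binding_dir and return the area of their 2D convex hull."""
        d = np.asarray(binding_dir, dtype=float)
        d /= np.linalg.norm(d)
        # Build an orthonormal basis (u, v) spanning the projection plane.
        helper = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        u = np.cross(d, helper); u /= np.linalg.norm(u)
        v = np.cross(d, u)
        pts2d = np.asarray(points, float) @ np.column_stack((u, v))
        return ConvexHull(pts2d).volume  # for 2D input, .volume is the enclosed area
    ```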

  14. Binding Direction-Based Two-Dimensional Flattened Contact Area Computing Algorithm for Protein–Protein Interactions

    Directory of Open Access Journals (Sweden)

    Beom Sik Kang

    2017-10-01

    Full Text Available Interactions between protein molecules are essential for the assembly, function, and regulation of proteins. The contact region between two protein molecules in a protein complex is usually complementary in shape for both molecules and the area of the contact region can be used to estimate the binding strength between two molecules. Although the area is a value calculated from the three-dimensional surface, it cannot represent the three-dimensional shape of the surface. Therefore, we propose an original concept of two-dimensional contact area which provides further information such as the ruggedness of the contact region. We present a novel algorithm for calculating the binding direction between two molecules in a protein complex, and then suggest a method to compute the two-dimensional flattened area of the contact region between two molecules based on the binding direction.

  15. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment, and with computer programs able to simulate anything the machinery can do, we now face the challenge of using this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about the design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning has already indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  16. Message-passing-interface-based parallel FDTD investigation on the EM scattering from a 1-D rough sea surface using uniaxial perfectly matched layer absorbing boundary.

    Science.gov (United States)

    Li, J; Guo, L-X; Zeng, H; Han, X-B

    2009-06-01

    A message-passing-interface (MPI)-based parallel finite-difference time-domain (FDTD) algorithm for the electromagnetic scattering from a 1-D randomly rough sea surface is presented. The uniaxial perfectly matched layer (UPML) medium is adopted for truncation of the FDTD lattices, in which the finite-difference equations can be used for the total computation domain by properly choosing the uniaxial parameters. This makes the parallel FDTD algorithm easier to implement. The parallel performance with different numbers of processors is illustrated for one sea surface realization, and the computation time of the parallel FDTD algorithm is dramatically reduced compared to a single-process implementation. Finally, some numerical results are shown, including the backscattering characteristics of the sea surface for different polarizations and the bistatic scattering from a sea surface at a large incident angle and a high wind speed.
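
    The authors' 2-D UPML-truncated code is not shown in the record. The sketch below only illustrates the underlying parallelization pattern: a 1-D free-space Yee update with ghost-cell exchange via mpi4py. Array sizes, the Courant number and the soft source are arbitrary illustration values, and the rough sea surface and UPML boundaries are omitted.

    ```python
    # Run with, e.g., `mpiexec -n 4 python fdtd_mpi_sketch.py` (requires mpi4py).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    N_LOCAL, STEPS, S = 200, 500, 0.5           # cells per rank, time steps, Courant number
    ez = np.zeros(N_LOCAL + 2)                  # +2 ghost cells
    hy = np.zeros(N_LOCAL + 2)
    left  = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for n in range(STEPS):
        # exchange the Ez ghost cell needed by the Hy update
        comm.Sendrecv(ez[1:2], dest=left, recvbuf=ez[-1:], source=right)
        hy[1:-1] += S * (ez[2:] - ez[1:-1])
        # exchange the Hy ghost cell needed by the Ez update
        comm.Sendrecv(hy[-2:-1], dest=right, recvbuf=hy[0:1], source=left)
        ez[1:-1] += S * (hy[1:-1] - hy[0:-2])
        if rank == 0:
            ez[1] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
    ```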

  17. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods applied to chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  18. Applications of computer based safety systems in Korea nuclear power plants

    International Nuclear Information System (INIS)

    Won Young Yun

    1998-01-01

    With the progress of computer technology, the applications of computer-based safety systems in Korean nuclear power plants have increased rapidly in recent decades. The main purpose of this movement is to take advantage of modern computer technology so as to improve the operability and maintainability of the plants. In practice, however, there has been considerable controversy between the regulatory body and the nuclear utility in Korea over the safety of computer-based systems. The Korea Institute of Nuclear Safety (KINS), the technical support organization for nuclear plant licensing, is currently under pressure to set up well-defined domestic regulatory requirements in this area. This paper presents the current status of, and the regulatory activities related to, the applications of computer-based safety systems in Korea. (author)

  19. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  20. A CAMAC-based laboratory computer system

    International Nuclear Information System (INIS)

    Westphal, G.P.

    1975-01-01

    A CAMAC-based laboratory computer network is described. By sharing a common mass memory, it offers distinct advantages over slow and core-consuming single-processor installations. A fast compiler-BASIC, with extensions for CAMAC and real-time operation, provides a convenient means for interactive experiment control

  1. Fast algorithm for the rendering of three-dimensional surfaces

    Science.gov (United States)

    Pritt, Mark D.

    1994-02-01

    It is often desirable to draw a detailed and realistic representation of surface data on a computer graphics display. One such representation is a 3D shaded surface. Conventional techniques for rendering shaded surfaces are slow, however, and require substantial computational power. Furthermore, many techniques suffer from aliasing effects, which appear as jagged lines and edges. This paper describes an algorithm for the fast rendering of shaded surfaces without aliasing effects. It is much faster than conventional ray tracing and polygon-based rendering techniques and is suitable for interactive use. On an IBM RISC System/6000™ workstation it renders a 1000 × 1000 surface in about 7 seconds.
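
    Pritt's algorithm itself is not reproduced here. For orientation, a minimal vectorized Lambertian shading of a heightfield, which is the kind of shaded-surface rendering the abstract refers to, can be written with NumPy as below; the light direction and test surface are arbitrary choices for illustration.

    ```python
    import numpy as np

    def shade_heightfield(z, light=(1.0, 1.0, 1.0)):
        """Lambertian shading of a heightfield z[y, x]; returns intensities in [0, 1].
        Vectorized over the whole grid, so no per-pixel ray casting is needed."""
        light = np.asarray(light, float)
        light /= np.linalg.norm(light)
        dzdy, dzdx = np.gradient(z.astype(float))
        # Surface normals of z = f(x, y): proportional to (-dz/dx, -dz/dy, 1).
        normals = np.dstack((-dzdx, -dzdy, np.ones_like(z)))
        normals /= np.linalg.norm(normals, axis=2, keepdims=True)
        return np.clip(normals @ light, 0.0, 1.0)

    # Example: shade a smooth bump sampled on a 1000 x 1000 grid.
    y, x = np.mgrid[-3:3:1000j, -3:3:1000j]
    img = shade_heightfield(np.exp(-(x**2 + y**2)))
    ```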

  2. Lowering the barriers to computational modeling of Earth's surface: coupling Jupyter Notebooks with Landlab, HydroShare, and CyberGIS for research and education.

    Science.gov (United States)

    Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.

    2017-12-01

    The ability to test hypotheses about hydrology, geomorphology and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, there remain significant educational, logistical and time investment barriers to their use. Knowledge infrastructure is an emerging intellectual framework to understand how people are creating, sharing and distributing knowledge - which has been dramatically transformed by Internet technologies. In addition to the technical and social components in a cyberinfrastructure system, knowledge infrastructure considers educational, institutional, and open source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for achieving data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook based on ROGER - the first cyberGIS supercomputer, so that models can be elastically reproduced through cloud computing approaches. Our team of geomorphologists, hydrologists, and computer geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore their shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modeling applications to prototyping new science tools, hands-on research demonstrations for training workshops, and classroom applications. Computational geospatial models based on big data and high performance computing can now be more efficiently

  3. Comparison of surface extraction techniques performance in computed tomography for 3D complex micro-geometry dimensional measurements

    DEFF Research Database (Denmark)

    Torralba, Marta; Jiménez, Roberto; Yagüe-Fabra, José A.

    2018-01-01

    The number of industrial applications of computed tomography (CT) for dimensional metrology in the 10⁰–10³ mm range has been continuously increasing, especially in the last years. Due to its specific characteristics, CT has the potential to be employed as a viable solution for measuring 3D complex micro-geometries as well (i.e., in the sub-mm dimensional range). However, there are different factors that may influence the CT process performance, one of them being the surface extraction technique used. In this paper, two different extraction techniques are applied to measure a complex miniaturized dental file by CT in order to analyze its contribution to the final measurement uncertainty in complex geometries at the mm to sub-mm scales. The first method is based on a similarity analysis (threshold determination), while the second one is based on a gradient or discontinuity analysis (the 3D…)

  4. Analysis of growth patterns during gravitropic curvature in roots of Zea mays by use of a computer-based video digitizer

    Science.gov (United States)

    Nelson, A. J.; Evans, M. L.

    1986-01-01

    A computer-based video digitizer system is described which allows automated tracking of markers placed on a plant surface. The system uses customized software to calculate relative growth rates at selected positions along the plant surface and to determine rates of gravitropic curvature based on the changing pattern of distribution of the surface markers. The system was used to study the time course of gravitropic curvature and changes in relative growth rate along the upper and lower surface of horizontally-oriented roots of maize (Zea mays L.). The growing region of the root was found to extend from about 1 mm behind the tip to approximately 6 mm behind the tip. In vertically-oriented roots the relative growth rate was maximal at about 2.5 mm behind the tip and declined smoothly on either side of the maximum. Curvature was initiated approximately 30 min after horizontal orientation with maximal (50 degrees) curvature being attained in 3 h. Analysis of surface extension patterns during the response indicated that curvature results from a reduction in growth rate along both the upper and lower surfaces with stronger reduction along the lower surface.
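
    The digitizer's calibration and tracking code are not given in the record. The sketch below only shows how relative elemental growth rates can be estimated from tracked marker coordinates as RGR = d(ln L)/dt over successive frames, which is the quantity the abstract describes; array shapes and units are assumptions for illustration.

    ```python
    import numpy as np

    def relative_growth_rates(marker_positions, times):
        """Relative elemental growth rates from tracked surface markers.

        marker_positions: array of shape (T, M, 2) -- (x, y) of M markers at T times.
        times: array of length T (e.g., minutes).
        Returns an array of shape (T-1, M-1): the RGR of each inter-marker segment,
        estimated as the finite difference of ln(segment length) over time.
        """
        pos = np.asarray(marker_positions, float)
        seg_len = np.linalg.norm(np.diff(pos, axis=1), axis=2)   # (T, M-1) segment lengths
        dt = np.diff(np.asarray(times, float))[:, None]          # (T-1, 1) time steps
        return np.diff(np.log(seg_len), axis=0) / dt
    ```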

  5. A Comparative Study of Paper-based and Computer-based Contextualization in Vocabulary Learning of EFL Students

    Directory of Open Access Journals (Sweden)

    Mousa Ahmadian

    2015-04-01

    Full Text Available Vocabulary acquisition is one of the largest and most important tasks in language classes. New technologies, such as computers, have helped considerably in this respect. The importance of the issue led the researchers to conduct the present study, which compares contextualized vocabulary learning on paper and through Computer Assisted Language Learning (CALL). To this end, 52 pre-university EFL learners were randomly assigned to two groups: a paper-based (PB) group and a computer-based (CB) group, each with 26 learners. The PB group received paper-based contextualization of the vocabulary items, while the CB group received computer-based contextualization of the vocabulary items through PowerPoint (PP) software. A pretest and a posttest, along with an immediate and a delayed posttest, were given to the learners. A paired-samples t-test of the pretest and posttest and an independent-samples t-test of the delayed and immediate posttests were carried out with SPSS software. The results revealed that computer-based contextualization had a greater effect on the vocabulary learning of Iranian EFL learners than paper-based contextualization of the words. Keywords: Computer-based contextualization, Paper-based contextualization, Vocabulary learning, CALL

  6. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large data base generated with COSIMA. The data base is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached; for clad rupture, extremely high rod internal pressure is necessary. Additionally, important results were found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments. This agreement could, however, be improved if the phase separation models in the codes were updated. (orig./HP) [de

  7. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, agents carry out negotiation, coordination, cooperation and collaboration on behalf of the customer in order to make decisions efficiently. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service through flooding attacks on other involved agents; and (c) misuse of exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connection of all the other agents participating in the negotiating services. This paper proposes algorithms to solve these issues and to ensure that there is no intervention of malicious activities during agent interaction.

  8. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…

  9. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  10. Plasma based Ar+ beam assisted poly(dimethylsiloxane) surface modification

    International Nuclear Information System (INIS)

    Vladkova, T.G.; Keranov, I.L.; Dineff, P.D.; Youroukov, S.Y.; Avramova, I.A.; Krasteva, N.; Altankov, G.P.

    2005-01-01

    Plasma-based Ar+ beam treatment performed in an RF (13.56 MHz) low-pressure (200 mTorr) glow discharge (at 100 W, 1200 W and 2500 W) with a serial capacitance was employed for surface modification of poly(dimethylsiloxane) (PDMS), aimed at improving its interactions with living cells. The presence of a serial capacitance gives rise to an ion flow inside the plasma volume directed toward the treated sample, and varying the discharge power varies the density of this ion flow. XPS analysis was performed to study the changes in the surface chemical composition of the modified samples, and the corresponding changes in the surface energy were monitored by contact angle measurements. We found that the plasma-based Ar+ beam transforms the initially hydrophobic PDMS surface into a hydrophilic one, mainly due to a rise in the polar component of the surface tension; this effect is most probably due to enrichment of the modified surface layer with permanent dipoles of a [SiOx]-based network and elimination of the original methyl groups. The initial adhesion of human fibroblast cells was studied on the above-described plasma-based Ar+ beam modified surfaces, grafted or not with acrylic acid (AA), pre-coated with fibronectin (FN) or left bare. The cell response seems to be related to the peculiar structure and wettability of the modified PDMS surface layer after plasma-based Ar+ beam treatment, followed or not by AA grafting

  11. Surface Modeling, Grid Generation, and Related Issues in Computational Fluid Dynamic (CFD) Solutions

    Science.gov (United States)

    Choo, Yung K. (Compiler)

    1995-01-01

    The NASA Steering Committee for Surface Modeling and Grid Generation (SMAGG) sponsored a workshop on surface modeling, grid generation, and related issues in Computational Fluid Dynamics (CFD) solutions at Lewis Research Center, Cleveland, Ohio, May 9-11, 1995. The workshop provided a forum to identify industry needs, strengths, and weaknesses of the five grid technologies (patched structured, overset structured, Cartesian, unstructured, and hybrid), and to exchange thoughts about where each technology will be in 2 to 5 years. The workshop also provided opportunities for engineers and scientists to present new methods, approaches, and applications in SMAGG for CFD. This Conference Publication (CP) consists of papers on industry overview, NASA overview, five grid technologies, new methods/ approaches/applications, and software systems.

  12. Surface channeling

    International Nuclear Information System (INIS)

    Sizmann, R.; Varelas, C.

    1976-01-01

    There is experimental evidence that swift light ions incident at small angles towards single crystalline surfaces can lose an appreciable fraction of their kinetic energy during reflection. It is shown that these projectiles penetrate into the bulk surface region of the crystal. They can travel as channeled particles along long paths through the solid (surface channeling). The angular distribution and the depth history of the re-emerged projectiles are investigated by computer simulations. A considerable fraction of the penetrating projectiles re-emerges from the crystal with constant transverse energy if the angle of incidence is smaller than the critical angle for axial channeling. Analytical formulae are derived based on a diffusion model for surface channeling. A comparison with experimental data exhibits the relevance of the analytical solutions. (Auth.)

  13. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial

    Science.gov (United States)

    2007-01-01

    Background At postgraduate level evidence based medicine (EBM) is currently taught through tutor based lectures. Computer based sessions fit around doctors' workloads, and standardise the quality of educational provision. There have been no randomized controlled trials comparing computer based sessions with traditional lectures at postgraduate level within medicine. Methods This was a randomised controlled trial involving six postgraduate education centres in the West Midlands, U.K. Fifty-five newly qualified foundation year one doctors (U.S. internship equivalent) were randomised to either computer based sessions or an equivalent lecture in EBM and systematic reviews. The change from pre- to post-intervention score was measured using a validated questionnaire assessing knowledge (primary outcome) and attitudes (secondary outcome). Results Both groups were similar at baseline. Participants' improvement in knowledge in the computer based group was equivalent to the lecture based group (gain in score: 2.1 [S.D = 2.0] versus 1.9 [S.D = 2.4]; ANCOVA p = 0.078). Attitudinal gains were similar in both groups. Conclusion On the basis of our findings we feel computer based teaching and learning is as effective as typical lecture based teaching sessions for educating postgraduates in EBM and systematic reviews. PMID:17659076

  14. An endohedral fullerene-based nuclear spin quantum computer

    International Nuclear Information System (INIS)

    Ju Chenyong; Suter, Dieter; Du Jiangfeng

    2011-01-01

    We propose a new scalable quantum computer architecture based on endohedral fullerene molecules. Qubits are encoded in the nuclear spins of the endohedral atoms, which possess even longer coherence times than the electron spins used as qubits in previous proposals. To address the individual qubits, we use the hyperfine interaction, which distinguishes two modes (active and passive) of the nuclear spin. Two-qubit quantum gates are effectively implemented by employing the electronic dipolar interaction between adjacent molecules. The electron spins also assist in the qubit initialization and readout. Our architecture should be significantly easier to implement than earlier proposals for spin-based quantum computers, such as the concept of Kane [B.E. Kane, Nature 393 (1998) 133]. - Research highlights: → We propose an endohedral fullerene-based scalable quantum computer architecture. → Qubits are encoded on nuclear spins, while electron spins serve as auxiliaries. → Nuclear spins are individually addressed using the hyperfine interaction. → Two-qubit gates are implemented through the medium of electron spins.

  15. Organization of the secure distributed computing based on multi-agent system

    Science.gov (United States)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, developing methods for distributed computing receives much attention, and one such method is the use of multi-agent systems. Distributed computing organized on conventional networked computers can be exposed to security threats from the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by setting up a distributed computation. Agents running on the networked computers can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the distributed computing system according to the computing power of the computers on the network; this load distribution is sketched below. The number of computers connected to the network can be increased by connecting new computers to the system, which leads to an increase in overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computation. This organization of the distributed computing system reduces the problem-solving time and increases the fault tolerance (vitality) of the computing processes in a changing computing environment (dynamic change of the number of computers on the network). The developed multi-agent system detects cases of falsification of the results of the distributed system, which may otherwise lead to wrong decisions. In addition, the system checks and corrects wrong results.
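
    The unified agent algorithm is not given in the abstract; the fragment below is only a toy sketch of the load-distribution idea it mentions, splitting tasks among nodes in proportion to an assumed per-node benchmark score. Node names and scores are hypothetical.

    ```python
    def split_load(total_tasks, node_power):
        """Distribute total_tasks among nodes proportionally to their measured
        computing power (e.g., a benchmark score per node); any remainder goes
        to the fastest nodes first."""
        total_power = sum(node_power.values())
        shares = {n: int(total_tasks * p / total_power) for n, p in node_power.items()}
        leftover = total_tasks - sum(shares.values())
        for n in sorted(node_power, key=node_power.get, reverse=True)[:leftover]:
            shares[n] += 1
        return shares

    # Example: 100 tasks over three nodes with benchmark scores 1.0, 2.0 and 5.0.
    print(split_load(100, {"pc-a": 1.0, "pc-b": 2.0, "pc-c": 5.0}))
    ```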

  16. Navier-Stokes Computations of a Wing-Flap Model With Blowing Normal to the Flap Surface

    Science.gov (United States)

    Boyd, D. Douglas, Jr.

    2005-01-01

    A computational study of a generic wing with a half span flap shows the mean flow effects of several blown flap configurations. The effort compares and contrasts the thin-layer, Reynolds averaged, Navier-Stokes solutions of a baseline wing-flap configuration with configurations that have blowing normal to the flap surface through small slits near the flap side edge. Vorticity contours reveal a dual vortex structure at the flap side edge for all cases. The dual vortex merges into a single vortex at approximately the mid-flap chord location. Upper surface blowing reduces the strength of the merged vortex and moves the vortex away from the upper edge. Lower surface blowing thickens the lower shear layer and weakens the merged vortex, but not as much as upper surface blowing. Side surface blowing forces the lower surface vortex farther outboard of the flap edge by effectively increasing the aerodynamic span of the flap. It is seen that there is no global aerodynamic penalty or benefit from the particular blowing configurations examined.

  17. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  18. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  19. Bio-Inspired Functional Surfaces Based on Laser-Induced Periodic Surface Structures.

    Science.gov (United States)

    Müller, Frank A; Kunz, Clemens; Gräf, Stephan

    2016-06-15

    Nature developed numerous solutions to solve various technical problems related to material surfaces by combining the physico-chemical properties of a material with periodically aligned micro/nanostructures in a sophisticated manner. The utilization of ultra-short pulsed lasers allows mimicking many of these features by generating laser-induced periodic surface structures (LIPSS). In this review paper, we describe the physical background of LIPSS generation as well as the physical principles of surface-related phenomena like wettability, reflectivity, and friction. Then we introduce several biological examples including, e.g., lotus leaves, springtails, desert beetles, moth eyes, butterfly wings, weevils, sharks, pangolins, and snakes to illustrate how nature solves technical problems, and we give a comprehensive overview of recent achievements related to the utilization of LIPSS to generate superhydrophobic, anti-reflective, colored, and drag-resistant surfaces. Finally, we conclude with some future developments and perspectives related to forthcoming applications of LIPSS-based surfaces.

  20. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  1. Computer-based Programs in Speech Therapy of Dyslalia and Dyslexia- Dysgraphia

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2010-04-01

    Full Text Available In recent years, researchers and therapists in speech therapy have become increasingly concerned with the development and use of computer programs in speech disorder therapy. The main objective of this study was to evaluate the therapeutic effectiveness of computer-based programs for the Romanian language in speech therapy. We present experimental research assessing the effectiveness of computer programs in the therapy of the speech disorders dyslalia, dyslexia and dysgraphia. Methodologically, the use of the computer in the therapeutic phases was carried out with the help of computer-based programs (Logomon, Dislex-Test etc.) that we elaborated and experimented with during several years of therapeutic activity. The sample used in our experiments was composed of 120 subjects; two groups of 60 children with speech disorders were selected for each of the two speech disorder categories: 30 for the experimental ('computer-based') group and 30 for the control ('classical method') group. The study hypotheses tested whether the results obtained by the subjects within the experimental group improved significantly after using the computer-based program, compared to the subjects within the control group, who did not use this program but received classical therapy. The hypotheses were confirmed for the speech disorders included in this research; the conclusions of the study confirm the advantages of using computer-based programs within speech therapy for correcting these disorders, as well as the positive influence these programs have on the development of children's personality.

  2. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their daily lives, but at the same time many network information problems demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  3. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to

  4. Sulfide Mineral Surfaces

    International Nuclear Information System (INIS)

    Rosso, Kevin M.; Vaughan, David J.

    2006-01-01

    The past twenty years or so have seen dramatic development of the experimental and theoretical tools available to study the surfaces of solids at the molecular ('atomic resolution') scale. On the experimental side, two areas of development well illustrate these advances. The first concerns the high intensity photon sources associated with synchrotron radiation; these have both greatly improved the surface sensitivity and spatial resolution of already established surface spectroscopic and diffraction methods, and enabled the development of new methods for studying surfaces. The second centers on the scanning probe microscopy (SPM) techniques initially developed in the 1980's with the first scanning tunneling microscope (STM) and atomic force microscope (AFM) experiments. The direct 'observation' of individual atoms at surfaces made possible with these methods has truly revolutionized surface science. On the theoretical side, the availability of high performance computers coupled with advances in computational modeling has provided powerful new tools to complement the advances in experiment. Particularly important have been the quantum mechanics based computational approaches such as density functional theory (DFT), which can now be easily used to calculate the equilibrium crystal structures of solids and surfaces from first principles, and to provide insights into their electronic structure. In this chapter, we review current knowledge of sulfide mineral surfaces, beginning with an overview of the principles relevant to the study of the surfaces of all crystalline solids. This includes the thermodynamics of surfaces, the atomic structure of surfaces (surface crystallography and structural stability, adjustments of atoms at the surface through relaxation or reconstruction, surface defects) and the electronic structure of surfaces. We then discuss examples where specific crystal surfaces have been studied, with the main sulfide minerals organized by structure type

  5. Sulfide Mineral Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Rosso, Kevin M.; Vaughan, David J.

    2006-08-01

    The past twenty years or so have seen dramatic development of the experimental and theoretical tools available to study the surfaces of solids at the molecular ('atomic resolution') scale. On the experimental side, two areas of development well illustrate these advances. The first concerns the high intensity photon sources associated with synchrotron radiation; these have both greatly improved the surface sensitivity and spatial resolution of already established surface spectroscopic and diffraction methods, and enabled the development of new methods for studying surfaces. The second centers on the scanning probe microscopy (SPM) techniques initially developed in the 1980's with the first scanning tunneling microscope (STM) and atomic force microscope (AFM) experiments. The direct 'observation' of individual atoms at surfaces made possible with these methods has truly revolutionized surface science. On the theoretical side, the availability of high performance computers coupled with advances in computational modeling has provided powerful new tools to complement the advances in experiment. Particularly important have been the quantum mechanics based computational approaches such as density functional theory (DFT), which can now be easily used to calculate the equilibrium crystal structures of solids and surfaces from first principles, and to provide insights into their electronic structure. In this chapter, we review current knowledge of sulfide mineral surfaces, beginning with an overview of the principles relevant to the study of the surfaces of all crystalline solids. This includes the thermodynamics of surfaces, the atomic structure of surfaces (surface crystallography and structural stability, adjustments of atoms at the surface through relaxation or reconstruction, surface defects) and the electronic structure of surfaces. We then discuss examples where specific crystal surfaces have been studied, with the main sulfide minerals organized by

  6. An airport surface surveillance solution based on fusion algorithm

    Science.gov (United States)

    Liu, Jianliang; Xu, Yang; Liang, Xuelin; Yang, Yihuang

    2017-01-01

    In this paper, we propose an airport surface surveillance solution that combines Multilateration (MLAT) and Automatic Dependent Surveillance Broadcast (ADS-B). The moving target to be monitored is regarded as a linear stochastic hybrid system moving freely, and each surveillance technology is simplified as a sensor with white Gaussian noise. The dynamic model of the target and the observation model of the sensors are established in this paper. The measurements of the sensors are filtered by estimators to obtain the estimation results for the current time. We then analyze the characteristics of the two fusion schemes proposed and decide to use the scheme based on sensor estimation fusion for our surveillance solution. In the proposed fusion algorithm, the estimation error is quantified from the output of the estimators, and the fusion weight of each sensor is calculated. The two estimation results are fused with these weights, and the position estimate of the target is computed accurately. Finally, the proposed solution and algorithm are validated by an illustrative target-tracking simulation.
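
    The paper's exact weighting rule is not reproduced in the abstract. A common way to realize "fusion weights derived from quantified estimation errors" is inverse-covariance weighting of the two track estimates, sketched below under the assumption of independent estimation errors; all numbers are illustrative.

    ```python
    import numpy as np

    def fuse_estimates(x_mlat, p_mlat, x_adsb, p_adsb):
        """Fuse two independent position estimates (e.g., MLAT and ADS-B tracks)
        with weights inversely proportional to their error covariances.
        x_*: state estimates (2-vectors); p_*: 2x2 covariance matrices."""
        w_mlat = np.linalg.inv(p_mlat)
        w_adsb = np.linalg.inv(p_adsb)
        p_fused = np.linalg.inv(w_mlat + w_adsb)
        x_fused = p_fused @ (w_mlat @ x_mlat + w_adsb @ x_adsb)
        return x_fused, p_fused

    # Example: ADS-B is more accurate along x, MLAT along y (illustrative numbers).
    x, p = fuse_estimates(np.array([100.0, 50.0]), np.diag([9.0, 1.0]),
                          np.array([102.0, 48.0]), np.diag([1.0, 9.0]))
    ```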

  7. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  8. A Novel UDT-Based Transfer Speed-Up Protocol for Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhijie Han

    2018-01-01

    Full Text Available Fog computing is a distributed computing model that acts as the middle layer between the cloud data center and IoT devices/sensors. It provides computing, network, and storage resources so that cloud-based services can be closer to IoT devices and sensors. Cloud computing requires a lot of bandwidth, and the bandwidth of the wireless network is limited. In contrast, the amount of bandwidth required for "fog computing" is much less. In this paper, we propose an improved protocol, the Peer Assistant UDT-Based Data Transfer Protocol (PaUDT), applied to IoT-cloud computing. Furthermore, we compare the efficiency of the congestion control algorithm of UDT with that of Adobe's Secure Real-Time Media Flow Protocol (RTMFP), which is based entirely on UDP at the transport layer. Finally, we build an evaluation model of UDT performance in terms of RTT and bit error ratio. The theoretical analysis and experimental results show that UDT performs well in IoT-cloud computing.

  9. Experimental and computational studies on a gasifier based stove

    International Nuclear Information System (INIS)

    Varunkumar, S.; Rajan, N.K.S.; Mukunda, H.S.

    2012-01-01

    Highlights: ► A simple method to calculate the fraction of HHC was devised. ► η_g for the stove is the same as that of a downdraft gasifier. ► Gas from the stove contains 5.5% of CH₄ equivalent of HHC. ► The effect of vessel size on utilization efficiency is brought out clearly. ► The contribution of radiative heat transfer from the char bed to efficiency is 6%. - Abstract: The work reported here is concerned with a detailed thermochemical evaluation of the flaming mode behaviour of a gasifier-based stove. Determination of the gas composition over the fuel bed, and of the surface and gas temperatures in the gasification process, constitutes the principal experimental work. A simple atomic balance for the gasification reaction, combined with the gas composition from the experiments, is used to determine the CH₄ equivalent of higher hydrocarbons (HHC) and the gasification efficiency (η_g). The components of utilization efficiency, namely gasification–combustion and heat transfer, are explored. Reactive flow computational studies using the measured gas composition over the fuel bed are used to simulate the thermochemical flow field and heat transfer to the vessel; hitherto-ignored vessel size effects in the extraction of heat from the stove are established clearly. The overall flaming mode efficiency of the stove is 50–54%; the convective and radiative components of heat transfer are established to be 45–47% and 5–7%, respectively. The efficiency estimates from reacting computational fluid dynamics (RCFD) compare well with experiments.
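
    The authors' atomic-balance procedure is not reproduced in the abstract. The sketch below only shows the standard cold-gas definition of gasification efficiency computed from a measured gas composition, which is the kind of quantity η_g denotes; the heating values, gas yield and fuel LHV are illustrative assumptions, not measured values from the paper.

    ```python
    # Approximate lower heating values of the combustible gas species, MJ/Nm3.
    LHV_GAS = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}

    def cold_gas_efficiency(gas_fractions, gas_yield_nm3_per_kg, fuel_lhv_mj_per_kg):
        """gas_fractions: volume fractions of species in the dry product gas;
        gas_yield: Nm3 of product gas per kg of fuel; fuel_lhv: MJ/kg of the fuel."""
        gas_energy = sum(LHV_GAS[s] * f for s, f in gas_fractions.items() if s in LHV_GAS)
        return gas_energy * gas_yield_nm3_per_kg / fuel_lhv_mj_per_kg

    # Example with made-up numbers: 20% CO, 18% H2, 2% CH4, 2.2 Nm3/kg, wood LHV 16 MJ/kg.
    eta_g = cold_gas_efficiency({"CO": 0.20, "H2": 0.18, "CH4": 0.02}, 2.2, 16.0)
    ```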

  10. Minimally invasive registration for computer-assisted orthopedic surgery: combining tracked ultrasound and bone surface points via the P-IMLOP algorithm.

    Science.gov (United States)

    Billings, Seth; Kang, Hyun Jae; Cheng, Alexis; Boctor, Emad; Kazanzides, Peter; Taylor, Russell

    2015-06-01

    We present a registration method for computer-assisted total hip replacement (THR) surgery, which we demonstrate to improve the state of the art by both reducing the invasiveness of current methods and increasing registration accuracy. A critical element of computer-guided procedures is the determination of the spatial correspondence between the patient and a computational model of patient anatomy. The current method for establishing this correspondence in robot-assisted THR is to register points intraoperatively sampled by a tracked pointer from the exposed proximal femur and, via auxiliary incisions, from the distal femur. In this paper, we demonstrate a noninvasive technique for sampling points on the distal femur using tracked B-mode ultrasound imaging and present a new algorithm for registering these data called Projected Iterative Most-Likely Oriented Point (P-IMLOP). Points and normal orientations of the distal bone surface are segmented from ultrasound images and registered to the patient model along with points sampled from the exposed proximal femur via a tracked pointer. The proposed approach is evaluated using a bone- and tissue-mimicking leg phantom constructed to enable accurate assessment of experimental registration accuracy with respect to a CT-image-based model of the phantom. These experiments demonstrate that localization of the femur shaft is greatly improved by tracked ultrasound. The experiments further demonstrate that, for ultrasound-based data, the P-IMLOP algorithm significantly improves registration accuracy compared to the standard ICP algorithm. Registration via tracked ultrasound and the P-IMLOP algorithm has high potential to reduce the invasiveness and improve the registration accuracy of computer-assisted orthopedic procedures.
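
    P-IMLOP itself is not reproduced in the abstract. For reference, the standard ICP baseline it is compared against alternates closest-point matching with a least-squares rigid alignment step, whose core SVD (Kabsch) solution is sketched below; the surrounding matching loop and the ultrasound segmentation are omitted.

    ```python
    import numpy as np

    def rigid_align(src, dst):
        """Least-squares rigid transform (R, t) mapping matched points src -> dst,
        via the SVD (Kabsch) solution used inside each ICP iteration."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = dst_c - R @ src_c
        return R, t
    ```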

  11. Computational study of platinum nanoparticle deposition on the surfaces of crevices

    Energy Technology Data Exchange (ETDEWEB)

    Gu, H.F., E-mail: guhaifeng@hrbeu.edu.cn [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); College of Nuclear Science and Technology, Harbin Engineering University, 150001 Harbin (China); Niceno, B. [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Grundler, P.V. [Laboratory for Nuclear Materials, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Sharabi, M. [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Mechanical Power Engineering Department, Mansoura University, 35516 Mansoura (Egypt); Veleva, L. [Laboratory for Nuclear Materials, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Hot Laboratory Division, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Ritter, S. [Laboratory for Nuclear Materials, Nuclear Energy and Safety Research Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)

    2016-08-01

    Highlights: • Nano-particle deposition on the surface of crevices is studied using RANS simulation. • Model results are validated by comparing with experimental data. • Behaviours and mechanisms of particle deposition in different crevices are analyzed. • RANS models with a Lagrangian particle tracking method are evaluated and discussed. - Abstract: A well-known issue in boiling water reactors (BWR), which can threaten their structural integrity, is stress corrosion cracking (SCC) of reactor internals and recirculation pipes due to the accumulation of oxidizing radiolysis products of water. Currently, many operators of BWRs use combined platinum particle and hydrogen injection into the reactor water to mitigate SCC by lowering the electrochemical corrosion potential. It is essential for efficient mitigation that Pt particles reach all water-wetted surfaces, including crevices and cracks, which are also reached by the oxidizing species. In this study, a set of crevices with different widths and orientations with respect to the fluid flow is investigated using numerical simulation tools and compared against experimental findings. The Reynolds-Averaged Navier–Stokes models are used to compute the mean turbulent flow quantities in three-dimensional crevices, and the discrete random walk model is used to evaluate the effect of velocity fluctuations on particle movement. The Lagrangian particle tracking analysis is performed and the average concentration of deposited particles on the surface of the crevices is evaluated and compared with experimental results. The results show that the Reynolds stress model combined with enhanced wall treatment provides a more accurate prediction of particle concentration and distribution on the surface of the crevices than the SST k–ω turbulence model, which was expected, owing to the anisotropic nature of the Reynolds stress model. Furthermore, analyses of the particle deposition show that three different mechanisms play important roles in

  12. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

    Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230 + dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
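
    The phantom pipeline above centers on fuzzy C-means clustering of voxel intensities. A minimal 1-D, intensity-only sketch of that clustering step is given below, omitting the motion-blur handling, bilateral filtering, bias-field correction and multi-class/spatial refinements described in the abstract; the class count and parameters are illustrative.

    ```python
    import numpy as np

    def fuzzy_c_means(x, n_classes=3, m=2.0, n_iter=100, seed=0):
        """Plain fuzzy C-means on a 1-D array of voxel intensities x.
        Returns (centers, memberships); memberships has shape (len(x), n_classes)."""
        x = np.asarray(x, float)
        rng = np.random.default_rng(seed)
        u = rng.dirichlet(np.ones(n_classes), size=len(x))        # initial memberships
        for _ in range(n_iter):
            um = u ** m
            centers = (um.T @ x) / um.sum(axis=0)                 # weighted class means
            d = np.abs(x[:, None] - centers[None, :]) + 1e-12     # voxel-to-center distances
            u = 1.0 / (d ** (2.0 / (m - 1.0)))
            u /= u.sum(axis=1, keepdims=True)                     # normalize memberships
        return centers, u
    ```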

  13. Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies.

    Science.gov (United States)

    Khalil, Mohammed K; Mansour, Mahmoud M; Wilhite, Dewey R

    2010-01-01

    Strategies of presenting instructional information affect the type of cognitive load imposed on the learner's working memory. Effective instruction reduces extraneous (ineffective) cognitive load and promotes germane (effective) cognitive load. Eighty first-year students from two veterinary schools completed a two-section questionnaire that evaluated their perspectives on the educational value of a computer-based instructional program. They compared the difference between cognitive loads imposed by paper-based and computer-based instructional strategies used to teach the anatomy of the canine skeleton. Section I included 17 closed-ended items, rated on a five-point Likert scale, that assessed the use of graphics, content, and the learning process. Section II included a nine-point mental effort rating scale to measure the level of difficulty of instruction; students were asked to indicate the amount of mental effort invested in the learning task using both paper-based and computer-based presentation formats. The closed-ended data were expressed as means and standard deviations. A paired t test with an alpha level of 0.05 was used to determine the overall mean difference between the two presentation formats. Students positively evaluated their experience with the computer-based instructional program with a mean score of 4.69 (SD=0.53) for use of graphics, 4.70 (SD=0.56) for instructional content, and 4.45 (SD=0.67) for the learning process. The mean difference of mental effort (1.50) between the two presentation formats was significant, t=8.26, p≤.0001, df=76, for two-tailed distribution. Consistent with cognitive load theory, innovative computer-based instructional strategies decrease extraneous cognitive load compared with traditional paper-based instructional strategies.

  14. Bio-Inspired Functional Surfaces Based on Laser-Induced Periodic Surface Structures

    Directory of Open Access Journals (Sweden)

    Frank A. Müller

    2016-06-01

    Full Text Available Nature developed numerous solutions to solve various technical problems related to material surfaces by combining the physico-chemical properties of a material with periodically aligned micro/nanostructures in a sophisticated manner. The utilization of ultra-short pulsed lasers allows mimicking many of these features by generating laser-induced periodic surface structures (LIPSS). In this review paper, we describe the physical background of LIPSS generation as well as the physical principles of surface-related phenomena like wettability, reflectivity, and friction. Then we introduce several biological examples including, e.g., lotus leaves, springtails, desert beetles, moth eyes, butterfly wings, weevils, sharks, pangolins, and snakes to illustrate how nature solves technical problems, and we give a comprehensive overview of recent achievements related to the utilization of LIPSS to generate superhydrophobic, anti-reflective, colored, and drag-resistant surfaces. Finally, we conclude with some future developments and perspectives related to forthcoming applications of LIPSS-based surfaces.

  15. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
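
    The central idea above - a fixed driven dynamical system with only a linear readout trained - can be illustrated with a minimal software echo-state network. The sketch below (plain NumPy, with an arbitrary reservoir size, toy task and regularisation; it is not the authors' optical setup) trains only the output weights by ridge regression.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 200                                        # reservoir size (arbitrary)
      W_in = rng.uniform(-0.5, 0.5, N)               # fixed random input weights
      W = rng.normal(0, 1, (N, N))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

      def run_reservoir(u):
          """Drive the reservoir with input sequence u and collect its states."""
          x = np.zeros(N)
          states = []
          for u_t in u:
              x = np.tanh(W @ x + W_in * u_t)        # nonlinear node update
              states.append(x.copy())
          return np.array(states)

      # Train only the linear readout (ridge regression) - the "few parameters" to tune.
      u = rng.uniform(-1, 1, 1000)
      y_target = np.roll(u, 1)                       # toy task: reproduce the previous input
      X = run_reservoir(u)
      ridge = 1e-6
      W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y_target)
      nrmse = np.sqrt(np.mean((X @ W_out - y_target) ** 2)) / np.std(y_target)
      print("train NRMSE:", nrmse)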

  16. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  17. The skill of surface registration in CT-based navigation system for total hip arthroplasty

    International Nuclear Information System (INIS)

    Hananouchi, T.; Sugano, N.; Nishii, T.; Miki, H.; Sakai, T.; Yoshikawa, H.; Iwana, D.; Yamamura, M.; Nakamura, N.

    2007-01-01

    Surface registration in the CT-based navigation system, which matches the computational space to the real surgical space, is a key step in guaranteeing the accuracy of navigation. However, it has not been well described how this accuracy is affected by the surgeon's registration skill. Here, we report the difference in registration error between eight surgeons experienced with navigation and six apprentice surgeons. A cadaveric pelvic model with an acetabular cup was made to measure the skill and learning curve of registration. After surface registration, two cup angles (inclination and anteversion) were recorded in the navigation system, and the variance of these cup angles over ten trials was compared between the experienced surgeons and the apprentices. In addition, we investigated whether the accuracy of registration by the apprentices was improved by visual information on how to take the surface points. The results showed a statistically significant difference in registration accuracy between the two groups. The accuracy of the second ten trials, after the apprentices received the visual information, showed great improvement. (orig.)

  18. Computer-Game-Based Tutoring of Mathematics

    Science.gov (United States)

    Ke, Fengfeng

    2013-01-01

    This in-situ, descriptive case study examined the potential of implementing computer mathematics games as an anchor for tutoring of mathematics. Data were collected from middle school students at a rural pueblo school and an urban Hispanic-serving school, through in-field observation, content analysis of game-based tutoring-learning interactions,…

  19. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  20. Solving the incompressible surface Navier-Stokes equation by surface finite elements

    Science.gov (United States)

    Reuther, Sebastian; Voigt, Axel

    2018-01-01

    We consider a numerical approach for the incompressible surface Navier-Stokes equation on surfaces with arbitrary genus g(S). The approach is based on a reformulation of the equation in Cartesian coordinates of the embedding R^3, penalization of the normal component, a Chorin projection method, and discretization in space by surface finite elements for each component. The approach thus requires only standard ingredients which most finite element implementations can offer. We compare computational results with discrete exterior calculus simulations on a torus and demonstrate the interplay of the flow field with the topology by showing realizations of the Poincaré-Hopf theorem on n-tori.

  1. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, M_N, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as P_N, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the M_N algorithm that do not appear for the P_N algorithm. We also observe that in weak scaling tests, the ratio in time to solution of M_N to P_N decreases.

  2. Memristor-Based Synapse Design and Training Scheme for Neuromorphic Computing Architecture

    Science.gov (United States)

    2012-06-01

    system level built upon the conventional Von Neumann computer architecture [2][3]. Developing the neuromorphic architecture at chip level by... creation of memristor-based neuromorphic computing architecture. Rather than the existing crossbar-based neuron network designs, we focus on memristor

  3. A Nuclear Safety System based on Industrial Computer

    International Nuclear Information System (INIS)

    Kim, Ji Hyeon; Oh, Do Young; Lee, Nam Hoon; Kim, Chang Ho; Kim, Jae Hack

    2011-01-01

    The Plant Protection System(PPS), a nuclear safety Instrumentation and Control (I and C) system for Nuclear Power Plants(NPPs), generates reactor trip on abnormal reactor condition. The Core Protection Calculator System (CPCS) is a safety system that generates and transmits the channel trip signal to the PPS on an abnormal condition. Currently, these systems are designed on the Programmable Logic Controller(PLC) based system and it is necessary to consider a new system platform to adapt simpler system configuration and improved software development process. The CPCS was the first implementation using a micro computer in a nuclear power plant safety protection system in 1980 which have been deployed in Ulchin units 3,4,5,6 and Younggwang units 3,4,5,6. The CPCS software was developed in the Concurrent Micro5 minicomputer using assembly language and embedded into the Concurrent 3205 computer. Following the micro computer based CPCS, PLC based Common-Q platform has been used for the ShinKori/ShinWolsong units 1,2 PPS and CPCS, and the POSAFE-Q PLC platform is used for the ShinUlchin units 1,2 PPS and CPCS. In developing the next generation safety system platform, several factors (e.g., hardware/software reliability, flexibility, licensibility and industrial support) can be considered. This paper suggests an Industrial Computer(IC) based protection system that can be developed with improved flexibility without losing system reliability. The IC based system has the advantage of a simple system configuration with optimized processor boards because of improved processor performance and unlimited interoperability between the target system and development system that use commercial CASE tools. This paper presents the background to selecting the IC based system with a case study design of the CPCS. Eventually, this kind of platform can be used for nuclear power plant safety systems like the PPS, CPCS, Qualified Indication and Alarm . Pami(QIAS-P), and Engineering Safety

  4. A Nuclear Safety System based on Industrial Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Hyeon; Oh, Do Young; Lee, Nam Hoon; Kim, Chang Ho; Kim, Jae Hack [Korea Electric Power Corporation Engineering and Construction, Daejeon (Korea, Republic of)

    2011-05-15

    The Plant Protection System(PPS), a nuclear safety Instrumentation and Control (I and C) system for Nuclear Power Plants(NPPs), generates reactor trip on abnormal reactor condition. The Core Protection Calculator System (CPCS) is a safety system that generates and transmits the channel trip signal to the PPS on an abnormal condition. Currently, these systems are designed on the Programmable Logic Controller(PLC) based system and it is necessary to consider a new system platform to adapt simpler system configuration and improved software development process. The CPCS was the first implementation using a micro computer in a nuclear power plant safety protection system in 1980 which have been deployed in Ulchin units 3,4,5,6 and Younggwang units 3,4,5,6. The CPCS software was developed in the Concurrent Micro5 minicomputer using assembly language and embedded into the Concurrent 3205 computer. Following the micro computer based CPCS, PLC based Common-Q platform has been used for the ShinKori/ShinWolsong units 1,2 PPS and CPCS, and the POSAFE-Q PLC platform is used for the ShinUlchin units 1,2 PPS and CPCS. In developing the next generation safety system platform, several factors (e.g., hardware/software reliability, flexibility, licensibility and industrial support) can be considered. This paper suggests an Industrial Computer(IC) based protection system that can be developed with improved flexibility without losing system reliability. The IC based system has the advantage of a simple system configuration with optimized processor boards because of improved processor performance and unlimited interoperability between the target system and development system that use commercial CASE tools. This paper presents the background to selecting the IC based system with a case study design of the CPCS. Eventually, this kind of platform can be used for nuclear power plant safety systems like the PPS, CPCS, Qualified Indication and Alarm . Pami(QIAS-P), and Engineering Safety

  5. Reciprocal Questioning and Computer-based Instruction in Introductory Auditing: Student Perceptions.

    Science.gov (United States)

    Watters, Mike

    2000-01-01

    An auditing course used reciprocal questioning (Socratic method) and computer-based instruction. Separate evaluations by 67 students revealed a strong aversion to the Socratic method; students expected professors to lecture. They showed a strong preference for the computer-based assignment. (SK)

  6. Nanophotonic quantum computer based on atomic quantum transistor

    International Nuclear Information System (INIS)

    Andrianov, S N; Moiseev, S A

    2015-01-01

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  7. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
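
    As a rough illustration of the free-form-deformation step that such morphing relies on (a generic sketch, not the authors' MbSO implementation), the following Python snippet deforms 2D points embedded in a Bernstein-basis control lattice; moving a single control point smoothly morphs every embedded point.

      import numpy as np
      from math import comb

      def bernstein(i, n, t):
          """Bernstein polynomial B_i^n(t)."""
          return comb(n, i) * t**i * (1.0 - t)**(n - i)

      def ffd_2d(points, lattice):
          """Deform 2D points (assumed to lie in the unit square) with a control
          lattice of shape (l+1, m+1, 2)."""
          l, m = lattice.shape[0] - 1, lattice.shape[1] - 1
          out = np.zeros_like(points)
          for k, (s, t) in enumerate(points):
              for i in range(l + 1):
                  for j in range(m + 1):
                      out[k] += bernstein(i, l, s) * bernstein(j, m, t) * lattice[i, j]
          return out

      # 3x3 control lattice initially coincident with a regular grid on the unit square
      lattice = np.stack(np.meshgrid(np.linspace(0, 1, 3), np.linspace(0, 1, 3), indexing="ij"), axis=-1)
      lattice[1, 2] += [0.0, 0.15]     # pull one control point upward -> smooth bump

      s = np.linspace(0, 1, 50)
      airfoil_like = np.column_stack([s, 0.5 + 0.05 * np.sin(np.pi * s)])  # stand-in shape
      deformed = ffd_2d(airfoil_like, lattice)
      print(deformed[:3])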

  8. Nanophotonic quantum computer based on atomic quantum transistor

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, S N [Institute of Advanced Research, Academy of Sciences of the Republic of Tatarstan, Kazan (Russian Federation); Moiseev, S A [Kazan E. K. Zavoisky Physical-Technical Institute, Kazan Scientific Center, Russian Academy of Sciences, Kazan (Russian Federation)

    2015-10-31

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  9. Smooth surfaces from rational bilinear patches

    KAUST Repository

    Shi, Ling

    2014-01-01

    Smooth freeform skins from simple panels constitute a challenging topic arising in contemporary architecture. We contribute to this problem area by showing how to approximate a negatively curved surface by smoothly joined rational bilinear patches. The approximation problem is solved with the help of a new computational approach to the hyperbolic nets of Huhnen-Venedey and Rörig and optimization algorithms based on it. We also discuss its limits, which lie in the topology of the input surface. Finally, freeform deformations based on Darboux transformations are used to generate smooth surfaces from smoothly joined Darboux cyclide patches; in this way we eliminate the restriction to surfaces with negative Gaussian curvature. © 2013 Elsevier B.V.

  10. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended

  11. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  12. Feasibility of Ultrasound-Based Computational Fluid Dynamics as a Mitral Valve Regurgitation Quantification Technique: Comparison with 2-D and 3-D Proximal Isovelocity Surface Area-Based Methods.

    Science.gov (United States)

    Jamil, Muhammad; Ahmad, Omar; Poh, Kian Keong; Yap, Choon Hwai

    2017-07-01

    Current Doppler echocardiography quantification of mitral regurgitation (MR) severity has shortcomings. Proximal isovelocity surface area (PISA)-based methods, for example, are unable to account for the fact that ultrasound Doppler can measure only one velocity component: toward or away from the transducer. In the present study, we used ultrasound-based computational fluid dynamics (Ub-CFD) to quantify mitral regurgitation and study its advantages and disadvantages compared with 2-D and 3-D PISA methods. For Ub-CFD, patient-specific mitral valve geometry and velocity data were obtained from clinical ultrasound followed by 3-D CFD simulations at an assumed flow rate. We then obtained the average ratio of the ultrasound Doppler velocities to CFD velocities in the flow convergence region, and scaled CFD flow rate with this ratio as the final measured flow rate. We evaluated Ub-CFD, 2-D PISA and 3-D PISA with an in vitro flow loop, which featured regurgitation flow through (i) a simplified flat plate with round orifice and (ii) a 3-D printed realistic mitral valve and regurgitation orifice. The Ub-CFD and 3-D PISA methods had higher precision than the 2-D PISA method. Ub-CFD had consistent accuracy under all conditions tested, whereas 2-D PISA had the lowest overall accuracy. In vitro investigations indicated that the accuracy of 2-D and 3-D PISA depended significantly on the choice of aliasing velocity. Evaluation of these techniques was also performed for two clinical cases, and the dependency of PISA on aliasing velocity was similarly observed. Ub-CFD was robustly accurate and precise and has promise for future translation to clinical practice. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
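
    The two quantification ideas compared above can be summarised in a few lines: the classic hemispheric 2-D PISA estimate Q = 2*pi*r^2*Va, and the Ub-CFD scaling of an assumed CFD flow rate by the mean Doppler-to-CFD velocity ratio in the flow-convergence region. The sketch below uses made-up numbers purely for illustration and is not the authors' pipeline.

      import numpy as np

      def pisa_flow_rate(r_m, v_aliasing_m_s):
          """Hemispheric 2-D PISA estimate: Q = 2*pi*r^2 * Va (result in m^3/s)."""
          return 2.0 * np.pi * r_m**2 * v_aliasing_m_s

      def ub_cfd_flow_rate(q_assumed, v_doppler, v_cfd_projected):
          """Scale an assumed CFD flow rate by the mean ratio of Doppler velocities to
          CFD velocities (projected onto the beam direction) in the convergence zone."""
          return q_assumed * np.mean(np.asarray(v_doppler) / np.asarray(v_cfd_projected))

      # Illustrative (made-up) numbers
      print(pisa_flow_rate(0.008, 0.35) * 1e6, "ml/s")   # ~140 ml/s for r = 8 mm, Va = 0.35 m/s
      q = ub_cfd_flow_rate(100.0, [0.42, 0.39, 0.45], [0.40, 0.41, 0.43])  # ml/s
      print(q, "ml/s")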

  13. On the Computation of Comprehensive Boolean Gröbner Bases

    Science.gov (United States)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal I in a Boolean polynomial ring B(Ā, X̄) with main variables X̄ and parameters Ā can be obtained by simply computing a usual Boolean Gröbner basis of I, regarding both X̄ and Ā as variables, with a certain block term order such that X̄ ≫ Ā. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field GF(2), enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over GF(2). Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing computation algorithms for comprehensive Boolean Gröbner bases.
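
    The reduction to ordinary Gröbner-basis computation over GF(2) can be illustrated with a small SymPy example (for illustration only; the authors' implementation is in Risa/Asir, and the polynomials below are invented). The Boolean field equations v^2 - v are adjoined for every variable, and a lex order with the main variables ahead of the parameter stands in for the block order X̄ ≫ Ā.

      from sympy import symbols, groebner

      x, y, a = symbols('x y a')          # x, y: main variables; a: parameter

      # Boolean ring: coefficients in GF(2) plus the field equation v**2 - v per variable.
      polys = [x*y + a*x + y, x + a*y + a, x**2 - x, y**2 - y, a**2 - a]

      # Lex order with x > y > a plays the role of a block order with main variables >> parameters.
      G = groebner(polys, x, y, a, order='lex', modulus=2)
      print(G.exprs)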

  14. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of computer animation of a set of elastic macroscopic objects based on the particle method. The main aim of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions, and interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (the particle method and cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the parallelization methods and considers the problems of load balancing, collision detection, process synchronization and distributed control of the animation.
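
    The particle-method side of such an animation can be sketched as particles connected by springs and integrated explicitly under gravity. The snippet below is a generic toy example with invented parameters, not the paper's animation system or its cellular-automata liquid.

      import numpy as np

      def step(pos, vel, springs, rest_len, k=200.0, mass=1.0, g=9.81, dt=1e-3):
          """One explicit integration step for a particle-spring elastic body under gravity."""
          forces = np.zeros_like(pos)
          forces[:, 1] -= mass * g                   # gravity on every particle
          for (i, j), L0 in zip(springs, rest_len):
              d = pos[j] - pos[i]
              length = np.linalg.norm(d)
              f = k * (length - L0) * d / length     # linear spring force along the link
              forces[i] += f
              forces[j] -= f
          vel = vel + dt * forces / mass
          return pos + dt * vel, vel

      # Two particles joined by one spring, dropped under gravity
      pos = np.array([[0.0, 1.0], [0.1, 1.0]])
      vel = np.zeros_like(pos)
      springs, rest_len = [(0, 1)], [0.1]
      for _ in range(100):
          pos, vel = step(pos, vel, springs, rest_len)
      print(pos)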

  15. Testing photogrammetry-based techniques for three-dimensional surface documentation in forensic pathology.

    Science.gov (United States)

    Urbanová, Petra; Hejna, Petr; Jurda, Mikoláš

    2015-05-01

    Three-dimensional surface technologies, particularly close range photogrammetry and optical surface scanning, have recently advanced into affordable, flexible and accurate techniques. Forensic postmortem investigation as performed on a daily basis, however, has not yet fully benefited from their potential. In the present paper, we tested two approaches to 3D external body documentation - digital camera-based photogrammetry combined with commercial Agisoft PhotoScan(®) software and stereophotogrammetry-based Vectra H1(®), a portable handheld surface scanner. In order to conduct the study, three human subjects were selected: a living person, a 25-year-old female, and two forensic cases admitted for postmortem examination at the Department of Forensic Medicine, Hradec Králové, Czech Republic (both 63-year-old males), one dead due to traumatic, self-inflicted injuries (suicide by hanging), the other diagnosed with heart failure. All three cases were photographed in a 360° manner with a Nikon 7000 digital camera and simultaneously documented with the handheld scanner. In addition to having recorded the pre-autopsy phase of the forensic cases, both techniques were employed in various stages of autopsy. The sets of collected digital images (approximately 100 per case) were further processed to generate point clouds and 3D meshes. Final 3D models (a pair per individual) were characterized by their numbers of points and polygons, then assessed visually and compared quantitatively using an ICP alignment algorithm and a point cloud comparison technique based on closest point-to-point distances. Both techniques were proven to be easy to handle and equally laborious. While collecting the images at autopsy took around 20 min, the post-processing was much more time-demanding and required up to 10 h of computation time. Moreover, for full-body scanning the post-processing of the handheld scanner data required rather time-consuming manual image alignment. In all instances the applied approaches
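
    A minimal sketch of the kind of closest point-to-point cloud comparison mentioned above, assuming SciPy's k-d tree and synthetic stand-in point clouds (this is not the software used in the study):

      import numpy as np
      from scipy.spatial import cKDTree

      def cloud_to_cloud_distances(reference, compared):
          """For every point in `compared`, distance to its nearest neighbour in `reference`."""
          tree = cKDTree(reference)
          d, _ = tree.query(compared)
          return d

      rng = np.random.default_rng(0)
      ref = rng.random((10000, 3))                        # stand-in for the photogrammetry mesh vertices
      cmp_cloud = ref + rng.normal(0, 0.002, ref.shape)   # stand-in for the handheld-scanner model
      d = cloud_to_cloud_distances(ref, cmp_cloud)
      print("mean distance = %.4f, 95th percentile = %.4f" % (d.mean(), np.percentile(d, 95)))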

  16. Low computation vision-based navigation for a Martian rover

    Science.gov (United States)

    Gavin, Andrew S.; Brooks, Rodney A.

    1994-01-01

    Construction and design details of the Mobot Vision System, a small, self-contained, mobile vision system, are presented. This system uses the view from the top of a small, roving, robotic vehicle to supply data that is processed in real-time to safely navigate the surface of Mars. A simple, low-computation algorithm for constructing a 3-D navigational map of the Martian environment to be used by the rover is discussed.

  17. Using computer-based training to facilitate radiation protection review

    International Nuclear Information System (INIS)

    Abercrombie, J.S.; Copenhaver, E.D.

    1989-01-01

    In a national laboratory setting, it is necessary to provide radiation protection overview and training to diverse parts of the laboratory population. This includes employees at research reactors, accelerators, waste facilities, radiochemical isotope processing, and analytical laboratories, among others. In addition, our own radiation protection and monitoring staffs must be trained. To assist in the implementation of this full range of training, ORNL has purchased prepackaged computer-based training in health physics and technical mathematics with training modules that can be selected from many topics. By selection of specific modules, appropriate radiation protection review packages can be determined to meet many individual program needs. Because our radiation protection personnel must have some previous radiation protection experience or the equivalent of an associate's degree in radiation protection for entry level, the computer-based training will serve primarily as review of major principles. Others may need very specific prior training to make the computer-based training effective in their work situations. 4 refs

  18. Hierarchical surface code for network quantum computing with modules of arbitrary size

    Science.gov (United States)

    Li, Ying; Benjamin, Simon C.

    2016-10-01

    The network paradigm for quantum computing involves interconnecting many modules to form a scalable machine. Typically it is assumed that the links between modules are prone to noise while operations within modules have a significantly higher fidelity. To optimize fault tolerance in such architectures we introduce a hierarchical generalization of the surface code: a small "patch" of the code exists within each module and constitutes a single effective qubit of the logic-level surface code. Errors primarily occur in a two-dimensional subspace, i.e., patch perimeters extruded over time, and the resulting noise threshold for intermodule links can exceed ~10% even in the absence of purification. Increasing the number of qubits within each module decreases the number of qubits necessary for encoding a logical qubit. But this advantage is relatively modest, and broadly speaking, a "fine-grained" network of small modules containing only about eight qubits is competitive in total qubit count versus a "coarse" network with modules containing many hundreds of qubits.

  19. Computed tomography of the ''armored brain''

    International Nuclear Information System (INIS)

    Ludwig, B.; Nix, W.; Lanksch, W.

    1983-01-01

    A calcified chronic subdural hematoma may cover the surface of the cerebral hemispheres to such an extent that one can talk of an ''armored brain''. Pathogenesis, clinical course and treatment are discussed based on the computed tomograms of five cases. (orig.)

  20. Tile-based rigidization surface parametric design study

    Science.gov (United States)

    Giner Munoz, Laura; Luntz, Jonathan; Brei, Diann; Kim, Wonhee

    2018-03-01

    Inflatable technologies have proven useful in consumer goods as well as in more recent applications including civil structures, aerospace, medical, and robotics. However, inflatable technologies are typically lacking in their ability to provide rigid structural support. Particle jamming improves upon this by providing structures which are normally flexible and moldable but become rigid when air is removed. Because these are based on an airtight bladder filled with loose particles, they always occupy the full volume of their rigid state, even when not rigidized. More recent developments in layer jamming have created thin, compact rigidizing surfaces replacing the loose volume of particles with thinly layered surface materials. Work in this area has been applied to several specific applications with positive results but has not generally provided the broader understanding of the rigidization performance as a function of design parameters required for directly adapting layer rigidization technology to other applications. This paper presents a parametric design study of a new layer jamming vacuum rigidization architecture: tile-based vacuum rigidization. This form of rigidization is based on layers of tiles contained within a thin vacuum bladder which can be bent, rolled, or otherwise compactly stowed, but when deployed flat, can be vacuumed and form a large, flat, rigid plate capable of supporting large forces both localized and distributed over the surface. The general architecture and operation, detailing rigidization and compliance mechanisms, are introduced. To quantitatively characterize the rigidization behavior, prototype rigidization surfaces are fabricated and an experimental technique is developed based on a 3-point bending test. Performance evaluation metrics are developed to describe the stiffness, load-bearing capacity, and internal slippage of tested prototypes. A set of experimental parametric studies is performed to better understand the impact of

  1. Reconstruction of sub-surface archaeological remains from magnetic data using neural computing.

    Science.gov (United States)

    Bescoby, D. J.; Cawley, G. C.; Chroston, P. N.

    2003-04-01

    The remains of a former Roman colonial settlement, once part of the classical city of Butrint in southern Albania have been the subject of a high resolution magnetic survey using a caesium-vapour magnetometer. The survey revealed the surviving remains of an extensive planned settlement and a number of outlying buildings, today buried beneath over 0.5 m of alluvial deposits. The aim of the current research is to derive a sub-surface model from the magnetic survey measurements, allowing an enhanced archaeological interpretation of the data. Neural computing techniques are used to perform the non-linear mapping between magnetic data and corresponding sub-surface model parameters. The adoption of neural computing paradigms potentially holds several advantages over other modelling techniques, allowing fast solutions for complex data, while having a high tolerance to noise. A multi-layer perceptron network with a feed-forward architecture is trained to estimate the shape and burial depth of wall foundations using a series of representative models as training data. Parameters used to forward model the training data sets are derived from a number of trial trench excavations targeted over features identified by the magnetic survey. The training of the network was optimized by first applying it to synthetic test data of known source parameters. Pre-processing of the network input data, including the use of a rotationally invariant transform, enhanced network performance and the efficiency of the training data. The approach provides good results when applied to real magnetic data, accurately predicting the depths and layout of wall foundations within the former settlement, verified by subsequent excavation. The resulting sub-surface model is derived from the averaged outputs of a ‘committee’ of five networks, trained with individualized training sets. Fuzzy logic inference has also been used to combine individual network outputs through correlation with data from a second
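
    The inversion idea described above - training a feed-forward network on forward-modelled magnetic profiles so that it predicts source parameters such as burial depth - can be sketched as follows. The forward model, network size and training set here are crude stand-ins for illustration, not those of the study.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)
      x = np.linspace(-5, 5, 40)                     # profile positions (m)

      def forward_anomaly(depth, width, k=1.0):
          """Crude stand-in forward model: anomaly of a buried wall-like source."""
          return k * width * depth / (x**2 + depth**2) ** 1.5

      # Synthetic training set: noisy profiles labelled with (depth, width)
      depths = rng.uniform(0.3, 2.0, 2000)
      widths = rng.uniform(0.3, 1.5, 2000)
      X = np.array([forward_anomaly(d, w) + rng.normal(0, 0.01, x.size)
                    for d, w in zip(depths, widths)])
      y = np.column_stack([depths, widths])

      net = MLPRegressor(hidden_layer_sizes=(30, 30), max_iter=2000, random_state=0)
      net.fit(X, y)

      test = forward_anomaly(1.0, 0.8)
      print(net.predict(test.reshape(1, -1)))        # roughly recovers [1.0, 0.8]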

  2. Computer Texture Mapping for Laser Texturing of Injection Mold

    Directory of Open Access Journals (Sweden)

    Yongquan Zhou

    2014-04-01

    Full Text Available Laser texturing is a relatively new multiprocess technique that has been used for machining 3D curved surfaces; it is a more flexible and efficient way to create decorative texture on the 3D curved surfaces of injection molds, so as to improve surface quality and achieve a cosmetic surface on molded plastic parts. In this paper, a novel method of laser texturing 3D curved surfaces based on a 3-axis galvanometer scanning unit is presented to protect the textured injection mold surface from the severe distortion often caused by traditional texturing processes. The method is based on computer texture mapping technology, which has been developed and is presented here. The developed texture mapping algorithm includes surface triangulation, notations, distortion measurement, control, and numerical methods. An interface for computer texture mapping has been built to implement the algorithm, controlling the distortion rate of the 3D texture model generated from the original 2D texture applied to the curved surface. Through a case study of laser texturing a high-curvature surface of the injection mold for a mouse top case, it is shown that the novel method meets the quality standard for laser texturing of injection molds.

  3. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  4. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  5. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with memory optimization, is presented in this paper. Different from the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can make use of the results. The correlation between projection lines and pixels can be used to optimize the computation. (authors)
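
    A minimal sketch of a pixel-driven projection, in which the projection line is located for every pixel and its contribution is split between the two nearest detector bins (the per-pixel coefficients); this is a generic illustration under simplifying assumptions, not the authors' optimized implementation.

      import numpy as np

      def pixel_driven_projection(image, theta, n_bins):
          """Project a square image at angle theta (radians) onto a 1-D detector,
          locating the projection position of every pixel centre and splitting its
          value between the two nearest detector bins by linear interpolation."""
          n = image.shape[0]
          ys, xs = np.mgrid[0:n, 0:n]
          xc, yc = xs - (n - 1) / 2.0, ys - (n - 1) / 2.0      # pixel centres about the image centre
          s = xc * np.cos(theta) + yc * np.sin(theta) + n_bins / 2.0
          sino = np.zeros(n_bins)
          lo = np.floor(s).astype(int)
          w = s - lo                                           # interpolation weight = coefficient
          for b, wt, v in zip(lo.ravel(), w.ravel(), image.ravel()):
              if 0 <= b < n_bins - 1:
                  sino[b] += (1.0 - wt) * v
                  sino[b + 1] += wt * v
          return sino

      img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0        # toy phantom
      print(pixel_driven_projection(img, np.pi / 4, 96).round(2))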

  6. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  7. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  8. Computational neuroanatomy: ontology-based representation of neural components and connectivity.

    Science.gov (United States)

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-02-05

    A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.

  9. Quantitative image analysis of vertebral body architecture - improved diagnosis in osteoporosis based on high-resolution computed tomography

    International Nuclear Information System (INIS)

    Mundinger, A.; Wiesmeier, B.; Dinkel, E.; Helwig, A.; Beck, A.; Schulte Moenting, J.

    1993-01-01

    71 women, 64 post-menopausal, were examined by single-energy quantitative computed tomography (SEQCT) and by high-resolution computed tomography (HRCT) scans through the middle of lumbar vertebral bodies. Computer-assisted image analysis of the high-resolution images assessed trabecular morphometry of the vertebral spongiosa texture. Texture parameters differed in women with and without age-reduced bone density, and in the former group also in patients with and without vertebral fractures. Discriminating parameters were the total number, diameter and variance of trabecular and intertrabecular spaces as well as the trabecular surface (p < 0.05). A texture index based on these statistically selected morphometric parameters identified a subgroup of patients suffering from fractures due to abnormal spongiosal architecture but with a bone mineral content not indicative of increased fracture risk. The combination of osteodensitometry and trabecular morphometry improves the diagnosis of osteoporosis and may contribute to the prediction of individual fracture risk. (author)

  10. Feature-based handling of surface faults in compact disc players

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Andersen, Palle

    2006-01-01

    In this paper a novel method called feature-based control is presented. The method is designed to improve compact disc players' handling of surface faults on the discs. The method is based on a fault-tolerant control scheme, which uses extracted features of the surface faults to remove those faults from the detector signals used for control during the occurrence of surface faults. The extracted features are coefficients of Karhunen–Loève approximations of the surface faults. The performance of the feature-based control scheme controlling compact disc players playing discs with surface faults has been validated experimentally. The proposed scheme reduces the control errors due to the surface faults, and in some cases where the standard fault handling scheme fails, our scheme keeps the CD player playing.
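
    The feature-extraction step can be sketched generically: build a Karhunen–Loève basis from recorded fault signatures (via an SVD) and subtract the reconstructed fault component from a detector signal. The signals, basis size and fault shapes below are synthetic stand-ins, not the authors' controller.

      import numpy as np

      def kl_basis(fault_examples, n_components=3):
          """Karhunen-Loeve basis of recorded surface-fault signatures (rows = examples)."""
          X = fault_examples - fault_examples.mean(axis=0)
          _, _, Vt = np.linalg.svd(X, full_matrices=False)
          return Vt[:n_components]                   # rows are the KL basis vectors

      def remove_fault(signal, basis):
          """Project the detector signal on the fault basis and subtract that component."""
          coeffs = basis @ signal                    # feature coefficients of the fault
          return signal - basis.T @ coeffs, coeffs

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 200)
      faults = np.array([np.exp(-((t - 0.5) / w) ** 2) for w in rng.uniform(0.02, 0.08, 40)])
      basis = kl_basis(faults)

      measured = 0.1 * np.sin(2 * np.pi * 5 * t) + np.exp(-((t - 0.5) / 0.05) ** 2)
      cleaned, coeffs = remove_fault(measured, basis)
      print("fault energy removed:", np.round(np.sum(coeffs ** 2), 3))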

  11. ISAT promises fail-safe computer-based reactor protection

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    AEA Technology's ISAT system is a multiplexed microprocessor-based reactor protection system which has very extensive self-monitoring capabilities and is inherently fail safe. It provides a way of addressing software reliability problems that have tended to hamper widespread introduction of computer-based reactor protection. (author)

  12. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    Science.gov (United States)

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are "in situ." In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that would disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed bAbI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power-limited and in situ concept computing.

  13. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun; Zeng, Wei; Zhang, Hui; Yong, Junhai; Paul, Jean Claude

    2010-01-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying

  14. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
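
    The Grid Convergence Index evaluation mentioned above follows the standard Roache procedure for three systematically refined meshes; a sketch with illustrative (made-up) solution values and the customary safety factor of 1.25 is given below. It is not taken from the report itself.

      import math

      def gci(f_fine, f_med, f_coarse, r=2.0, Fs=1.25):
          """Grid Convergence Index for three solutions on meshes refined by ratio r."""
          p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)  # observed order
          e21 = abs((f_med - f_fine) / f_fine)                                     # relative error, fine pair
          gci_fine = Fs * e21 / (r**p - 1.0)
          return p, gci_fine

      # Illustrative surface-averaged quantity on fine/medium/coarse meshes (made-up values)
      p, g = gci(f_fine=1.000, f_med=1.012, f_coarse=1.060, r=2.0)
      print(f"observed order p = {p:.2f}, GCI_fine = {100 * g:.2f} %")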

  15. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through an analysis of existing pallet pooling information platforms (PPIP), the paper points out that existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which have software, hardware, resource utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gives a PPIP architecture of two parts based on cloud computing: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. Finally, the method of how to deploy the PPIP based on cloud computing is proposed.

  16. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet the specific needs of distance students, such as the following: students' inability to sit for the scheduled test, conflicting test schedules, and students' flexibility to take examinations to improve their grades. In 2004, UT initiated a pilot project in the development of a system and program for the computer-based testing method. Then in 2005 and 2006, tryouts of the computer-based testing methods were conducted in 7 Regional Offices that were considered as having sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by the Regional Office staff. The development of the computer-based testing was initiated by conducting tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of the test involves the generation and selection of test items from the item bank collection of the UT Examination Center; thus the combination of the selected items comprises the test specification. Currently UT offers 250 courses involving the use of computer-based testing. Students expect that more courses will be offered with computer-based testing in Regional Offices within easy access by students.

  17. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  18. [Three dimensional CT reconstruction system on a personal computer].

    Science.gov (United States)

    Watanabe, E; Ide, T; Teramoto, A; Mayanagi, Y

    1991-03-01

    A new computer system to produce three dimensional surface image from CT scan has been invented. Although many similar systems have been already developed and reported, they are too expensive to be set up in routine clinical services because most of these systems are based on high power mini-computer systems. According to the opinion that a practical 3D-CT system should be used in daily clinical activities using only a personal computer, we have transplanted the 3D program into a personal computer working in MS-DOS (16-bit, 12 MHz). We added to the program a routine which simulates surgical dissection on the surface image. The time required to produce the surface image ranges from 40 to 90 seconds. To facilitate the simulation, we connected a 3D system with the neuronavigator. The navigator gives the position of the surgical simulation when the surgeon places the navigator tip on the patient's head thus simulating the surgical excision before the real dissection.

  19. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  20. COMPUTER GRAPHICS MEETS IMAGE FUSION: THE POWER OF TEXTURE BAKING TO SIMULTANEOUSLY VISUALISE 3D SURFACE FEATURES AND COLOUR

    Directory of Open Access Journals (Sweden)

    G. J. Verhoeven

    2017-08-01

    Full Text Available In recent years, structure-from-motion and multi-view stereo pipelines have become omnipresent in the cultural heritage domain. The fact that such Image-Based Modelling (IBM) approaches are capable of providing a photo-realistic texture along the three-dimensional (3D) digital surface geometry is often considered a unique selling point, certainly for those cases that aim for a visually pleasing result. However, this texture can very often also obscure the underlying geometrical details of the surface, making it very hard to assess the morphological features of the digitised artefact or scene. Instead of constantly switching between the textured and untextured version of the 3D surface model, this paper presents a new method to generate a morphology-enhanced colour texture for the 3D polymesh. The presented approach tries to overcome this switching between object visualisations by fusing the original colour texture data with a specific depiction of the surface normals. Whether applied to the original 3D surface model or a low-resolution derivative, this newly generated texture does not solely convey the colours in a proper way but also enhances the small- and large-scale spatial and morphological features that are hard or impossible to perceive in the original textured model. In addition, the technique is very useful for low-end 3D viewers, since no additional memory and computing capacity are needed to convey relief details properly. Apart from simple visualisation purposes, the textured 3D models are now also better suited for on-surface interpretative mapping and the generation of line drawings.

  1. Computer Graphics Meets Image Fusion: the Power of Texture Baking to Simultaneously Visualise 3d Surface Features and Colour

    Science.gov (United States)

    Verhoeven, G. J.

    2017-08-01

    In recent years, structure-from-motion and multi-view stereo pipelines have become omnipresent in the cultural heritage domain. The fact that such Image-Based Modelling (IBM) approaches are capable of providing a photo-realistic texture along with the three-dimensional (3D) digital surface geometry is often considered a unique selling point, certainly for those cases that aim for a visually pleasing result. However, this texture can very often also obscure the underlying geometrical details of the surface, making it very hard to assess the morphological features of the digitised artefact or scene. Instead of constantly switching between the textured and untextured versions of the 3D surface model, this paper presents a new method to generate a morphology-enhanced colour texture for the 3D polymesh. The presented approach overcomes this switching between object visualisations by fusing the original colour texture data with a specific depiction of the surface normals. Whether applied to the original 3D surface model or a low-resolution derivative, this newly generated texture does not solely convey the colours in a proper way but also enhances the small- and large-scale spatial and morphological features that are hard or impossible to perceive in the original textured model. In addition, the technique is very useful for low-end 3D viewers, since no additional memory and computing capacity are needed to convey relief details properly. Apart from simple visualisation purposes, the textured 3D models are now also better suited for on-surface interpretative mapping and the generation of line drawings.
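
    A minimal sketch of the kind of colour-and-relief fusion described above: the baked colour texture is modulated by a Lambertian shading term derived from a baked normal map. This is not the authors' exact fusion rule; the light direction, blend weights and file names are assumptions.

```python
# Hedged sketch: fuse a colour texture with a shaded depiction of surface normals
# so relief becomes visible in the baked texture. Requires numpy and Pillow.
import numpy as np
from PIL import Image

def fuse_colour_and_normals(colour_png, normal_png, light=(0.3, 0.3, 0.9)):
    colour = np.asarray(Image.open(colour_png).convert("RGB"), dtype=np.float32) / 255.0
    normals = np.asarray(Image.open(normal_png).convert("RGB"), dtype=np.float32) / 255.0
    n = normals * 2.0 - 1.0                                    # decode the normal map
    n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8
    light_dir = np.asarray(light) / np.linalg.norm(light)
    shading = np.clip(n @ light_dir, 0.0, 1.0)[..., None]      # Lambertian relief term
    fused = np.clip(colour * (0.4 + 0.6 * shading), 0.0, 1.0)  # modulate colour by relief
    return Image.fromarray((fused * 255).astype(np.uint8))

# Usage (placeholder file names):
# fuse_colour_and_normals("colour_bake.png", "normal_bake.png").save("fused_bake.png")
```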

  2. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Rev. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  3. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  4. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  5. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  6. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    Full Text Available This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
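
    A minimal sketch of the exponential-moving-average forecasting step described above: the next period's request load is predicted from past observations and resources are provisioned for it. The smoothing factor and the requests-per-VM ratio are assumptions, not values from the paper.

```python
# Hedged sketch of EMA-based elastic resource allocation; not the paper's code.
def ema_forecast(observations, alpha=0.3):
    """Exponential moving average of past request counts."""
    forecast = observations[0]
    for obs in observations[1:]:
        forecast = alpha * obs + (1 - alpha) * forecast
    return forecast

def required_vms(observations, requests_per_vm=50, alpha=0.3):
    """Elastically scale: allocate enough VMs for the forecast load (ceiling division)."""
    predicted = ema_forecast(observations, alpha)
    return max(1, -(-int(predicted) // requests_per_vm))

print(required_vms([80, 120, 150, 210, 260]))  # -> 4 with these assumed values
```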

  7. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.

    2013-01-01

    This paper presents a novel and computationally simple tri-training based semi-supervised steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve-Bayes classifier (NBC). The system uses nine characters...
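
    A sketch of the supervised core of such a system, i.e. autocorrelation-based features fed to a naive Bayes classifier; the tri-training (semi-supervised) part is not shown, and the sampling rate, lag range and synthetic signals are assumptions.

```python
# Hedged sketch of autocorrelation features + naive Bayes for SSVEP classification.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def autocorr_features(eeg_segment: np.ndarray, max_lag: int = 64) -> np.ndarray:
    """Normalised autocorrelation at lags 1..max_lag as a feature vector."""
    x = eeg_segment - eeg_segment.mean()
    denom = np.dot(x, x) + 1e-12
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

# Toy training set: two SSVEP flicker frequencies embedded in noise.
rng = np.random.default_rng(0)
fs = 256
t = np.arange(2 * fs) / fs
X, y = [], []
for label, freq in enumerate((8.0, 13.0)):
    for _ in range(40):
        sig = np.sin(2 * np.pi * freq * t) + rng.normal(0, 1.0, t.size)
        X.append(autocorr_features(sig))
        y.append(label)
clf = GaussianNB().fit(X, y)
```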

  8. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  9. Security personnel training using a computer-based game

    International Nuclear Information System (INIS)

    Ralph, J.; Bickner, L.

    1987-01-01

    Security personnel training is an integral part of a total physical security program, and is essential in enabling security personnel to perform their function effectively. Several training tools are currently available for use by security supervisors, including: textbook study, classroom instruction, and live simulations. However, due to shortcomings inherent in each of these tools, a need exists for the development of low-cost alternative training methods. This paper discusses one such alternative: a computer-based, game-type security training system. This system would be based on a personal computer with high-resolution graphics. Key features of this system include: a high degree of realism; flexibility in use and maintenance; high trainee motivation; and low cost

  10. Fog computing job scheduling optimization based on bees swarm

    Science.gov (United States)

    Bitam, Salim; Zeadally, Sherali; Mellouk, Abdelhamid

    2018-04-01

    Fog computing is a new computing architecture, composed of a set of near-user edge devices called fog nodes, which collaborate in order to perform computational services such as running applications, storing large amounts of data, and transmitting messages. Fog computing extends cloud computing by deploying digital resources at the premises of mobile users. In this new paradigm, management and operating functions, such as job scheduling, aim at providing high-performance, cost-effective services requested by mobile users and executed by fog nodes. We propose a new bio-inspired optimization approach called the Bees Life Algorithm (BLA) aimed at addressing the job scheduling problem in the fog computing environment. Our proposed approach is based on the optimized distribution of a set of tasks among all the fog computing nodes. The objective is to find an optimal tradeoff between CPU execution time and allocated memory required by fog computing services established by mobile users. Our empirical performance evaluation results demonstrate that the proposal outperforms the traditional particle swarm optimization and genetic algorithm in terms of CPU execution time and allocated memory.
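
    A sketch of the scheduling objective described above: assign tasks to fog nodes so as to trade off CPU execution time against allocated memory. The random-search baseline only illustrates the cost model; it is not the Bees Life Algorithm, and the weights and demand values are assumptions.

```python
# Hedged sketch of a task-to-fog-node assignment cost model with a simple baseline optimiser.
import random

def cost(assignment, task_cpu, task_mem, node_speed, w_time=0.7, w_mem=0.3):
    """Weighted tradeoff: makespan (max per-node CPU time) vs. peak per-node memory."""
    node_time = [0.0] * len(node_speed)
    node_mem = [0.0] * len(node_speed)
    for task, node in enumerate(assignment):
        node_time[node] += task_cpu[task] / node_speed[node]
        node_mem[node] += task_mem[task]
    return w_time * max(node_time) + w_mem * max(node_mem)

def random_search(task_cpu, task_mem, node_speed, iters=5000, seed=1):
    """Baseline optimiser over assignments (a stand-in for BLA/PSO/GA, not the paper's method)."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        cand = [rng.randrange(len(node_speed)) for _ in task_cpu]
        c = cost(cand, task_cpu, task_mem, node_speed)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

task_cpu = [4, 8, 2, 6, 5, 3]      # assumed task CPU demands
task_mem = [1, 2, 1, 3, 2, 1]      # assumed task memory demands
node_speed = [1.0, 2.0, 1.5]       # assumed fog-node speeds
print(random_search(task_cpu, task_mem, node_speed))
```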

  11. Learners’ views about cloud computing-based group activities

    Directory of Open Access Journals (Sweden)

    Yildirim Serkan

    2017-01-01

    Full Text Available Because they can be used independently of time and place during software development and make information easier to access with mobile technologies, cloud-based environments have attracted the attention of the education world, and this technology has started to be used in various activities. In this study, the effects of extracurricular group assignments carried out in cloud-based environments on learners of programming were evaluated in terms of group-work satisfaction, ease of use and user satisfaction. A total of 100 students (34 men and 66 women) participated in the study, which was carried out within a computer programming course lasting eight weeks. Participants were divided into groups of at least three people, in view of the advantages of cooperative learning in programming education. In this study, conducted in both conventional and cloud-based environments, a between-groups factorial design was used as the research design. The data, collected through questionnaires on opinions of group work, were examined with quantitative analysis methods. According to the results, extracurricular learning activities carried out as group activities created satisfaction; however, perceptions of the ease of use of the environment and user satisfaction were only partly positive. Despite otherwise similar views, male participants found it easier to use cloud computing-based environments. Variables such as class level, satisfaction, and computer and internet usage time had no effect on satisfaction or perceived ease of use. Evening-class students stated that they found cloud-based learning environments easy to use and were more satisfied with these environments, as well as happier with group work, than daytime students.

  12. The surface chemistry of metal-oxygen interactions

    DEFF Research Database (Denmark)

    Stokbro, Kurt; Baroni, Stefano

    1997-01-01

    We report on a computational study of the clean and oxygen-covered Rh(110) surface, based on density-functional theory within the local-density approximation. We have used plane-wave basis sets and Vanderbilt ultra-soft pseudopotentials. For the clean surface, we present results for the equilibrium...... structure, surface energy and surface stress of the unreconstructed and (1 x 2) reconstructed structures. For the oxygen-covered surface we have performed a geometry optimization at 0.5, 1, and 2 monolayer oxygen coverages, and we present results for the equilibrium configurations, workfunctions and oxygen...

  13. Influence of APOE Genotype on Hippocampal Atrophy over Time - An N=1925 Surface-Based ADNI Study.

    Directory of Open Access Journals (Sweden)

    Bolun Li

    Full Text Available The apolipoprotein E (APOE) e4 genotype is a powerful risk factor for late-onset Alzheimer's disease (AD). In the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort, we previously reported significant baseline structural differences in APOE e4 carriers relative to non-carriers, involving the left hippocampus more than the right--a difference more pronounced in e4 homozygotes than heterozygotes. We now examine the longitudinal effects of APOE genotype on hippocampal morphometry at 6, 12 and 24 months in the ADNI cohort. We employed a new automated surface registration system based on conformal geometry and tensor-based morphometry. Among different hippocampal surfaces, we computed high-order correspondences, using a novel inverse-consistent surface-based fluid registration method and multivariate statistics consisting of multivariate tensor-based morphometry (mTBM) and radial distance. At each time point, using Hotelling's T2 test, we found significant morphological deformation in APOE e4 carriers relative to non-carriers in the full cohort as well as in the non-demented (pooled MCI and control) subjects at each follow-up interval. In the complete ADNI cohort, we found greater atrophy of the left hippocampus than the right, and this asymmetry was more pronounced in e4 homozygotes than heterozygotes. These findings, combined with our earlier investigations, demonstrate an e4 dose effect on accelerated hippocampal atrophy, and support the enrichment of prevention trial cohorts with e4 carriers.
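
    A sketch of the two-sample Hotelling's T2 test applied per vertex to multivariate surface descriptors (e.g. mTBM features plus radial distance); the data below are synthetic, not ADNI data.

```python
# Hedged sketch of a two-sample Hotelling's T^2 test with its F approximation.
import numpy as np
from scipy import stats

def hotelling_t2(group1: np.ndarray, group2: np.ndarray):
    """Return (T^2 statistic, p-value) for two multivariate samples."""
    n1, p = group1.shape
    n2, _ = group2.shape
    diff = group1.mean(axis=0) - group2.mean(axis=0)
    s_pooled = ((n1 - 1) * np.cov(group1, rowvar=False)
                + (n2 - 1) * np.cov(group2, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
    f_stat = t2 * (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p)
    p_value = stats.f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value

rng = np.random.default_rng(0)
carriers = rng.normal(0.2, 1.0, size=(60, 4))      # 4 surface features per vertex (toy)
non_carriers = rng.normal(0.0, 1.0, size=(80, 4))
print(hotelling_t2(carriers, non_carriers))
```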

  14. Commentary on: "Toward Computer-Based Support of Metacognitive Skills: A Computational Framework to Coach Self Explanation"

    Science.gov (United States)

    Conati, Cristina

    2016-01-01

    This paper is a commentary on "Toward Computer-Based Support of Meta-Cognitive Skills: a Computational Framework to Coach Self-Explanation", by Cristina Conati and Kurt Vanlehn, published in the "IJAED" in 2000 (Conati and VanLehn 2010). This work was one of the first examples of Intelligent Learning Environments (ILE) that…

  15. Theoretical and Experimental Analysis of Adsorption in Surface-based Biosensors

    DEFF Research Database (Denmark)

    Hansen, Rasmus

    The present Ph.D. dissertation concerns the application of surface plasmon resonance (SPR) spectroscopy, which is a surface-based biosensor technology, for studies of adsorption dynamics. The thesis contains both experimental and theoretical work. In the theoretical part we develop the theory...... cell of the surface-based biosensor, in addition to the sensor surface, is investigated. In the experimental part of the thesis we use a Biacore SPR sensor to study lipase adsorption on model substrate surfaces, as well as competitive adsorption of lipase and surfactants. A part of the experimental...

  16. A rule-based computer control system for PBX-M neutral beams

    International Nuclear Information System (INIS)

    Frank, K.T.; Kozub, T.A.; Kugel, H.W.

    1987-01-01

    The Princeton Beta Experiment (PBX) neutral beams have been routinely operated under automatic computer control. A major upgrade of the computer configuration was undertaken to coincide with the PBX machine modification. The primary tasks included in the computer control system are data acquisition, waveform reduction, automatic control and data storage. The portion of the system which will remain intact is the rule-based approach to automatic control. Increased computational and storage capability will allow the expansion of the knowledge base previously used. The hardware configuration supported by the PBX Neutral Beam (XNB) software includes a dedicated Microvax with five CAMAC crates and four process controllers. The control algorithms are rule-based and goal-driven. The automatic control system raises ion source electrical parameters to selected energy goals and maintains these levels until new goals are requested or faults are detected
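
    An illustrative sketch of a rule-based, goal-driven control cycle of the kind described above: fault rules are checked first, then parameters are ramped toward their goals. It is not the XNB code; the parameter names, limits and step sizes are invented.

```python
# Hedged sketch of a goal-driven control loop with fault rules; all values are placeholders.
def step_toward_goal(value, goal, max_step):
    delta = max(-max_step, min(max_step, goal - value))
    return value + delta

def control_cycle(state, goals, rules, max_step=1.0):
    """One automatic-control cycle: apply fault rules, then ramp each parameter toward its goal."""
    for rule in rules:                      # each rule maps the state to a fault message or None
        fault = rule(state)
        if fault:
            return state, fault             # hold parameters and report the fault
    for name, goal in goals.items():
        state[name] = step_toward_goal(state[name], goal, max_step)
    return state, None

rules = [lambda s: "arc-current fault" if s.get("arc_current", 0) > 900 else None]
state, goals = {"beam_voltage": 0.0, "arc_current": 100.0}, {"beam_voltage": 40.0}
for _ in range(5):
    state, fault = control_cycle(state, goals, rules)
print(state, fault)   # beam_voltage ramped by five steps of 1.0 in this toy run
```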

  17. Surface enhanced raman spectroscopy on chip

    DEFF Research Database (Denmark)

    Hübner, Jörg; Anhøj, Thomas Aarøe; Zauner, Dan

    2007-01-01

    In this paper we report low-resolution surface-enhanced Raman spectra (SERS) recorded with a chip-based spectrometer. The flat-field spectrometer presented here is fabricated in SU-8 on silicon, showing a resolution of around 3 nm and a free spectral range of around 100 nm. The output facet...... is projected onto a CCD element and visualized by a computer. To enhance the otherwise rather weak Raman signal, a nanosurface is prepared and a sample solution is impregnated on this surface. The surface-enhanced Raman signal is picked up using a Raman probe and coupled into the spectrometer via an optical...... fiber. The obtained spectra show that the chip-based spectrometer together with the SERS-active surface can be used as a Raman sensor....

  18. Acid-base characteristics of powdered-activated-carbon surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Reed, B.E. (West Virginia Univ., Morgantown (United States)); Jensen, J.N.; Matsumoto, M.R. (State Univ. of New York, Buffalo (United States))

    Adsorption of heavy metals onto activated carbon has been described using the surface-complex-formation (SCF) model, a chemical equilibrium model. The SCF model requires knowledge of the amphoteric nature of activated carbon prior to metal adsorption modeling. In the past, a single-diprotic-acid-site model had been employed to describe the amphoteric nature of activated-carbon surfaces. In this study, the amphoteric nature of two powdered activated carbons was investigated, and a three-monoprotic-site surface model was found to be a plausible alternative. The single-diprotic-acid-site and two-monoprotic-site models did not adequately describe the acid-base behavior of the two carbons studied. The two-diprotic-site model was acceptable for only one of the study carbons. The acid-base behavior of activated-carbon surfaces seems to be best modeled as a series of weak monoprotic acids.
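
    A minimal sketch of a three-monoprotic-site description of the surface, treating all three sites as weak acids for simplicity: each site deprotonates with its own Ka, and the surface charge is the density-weighted sum of deprotonated fractions. The pKa values and site densities below are placeholders, not the fitted values from the study.

```python
# Hedged sketch of a three-monoprotic-site surface charge model; constants are assumed.
import numpy as np

def surface_charge(pH, pKas=(4.5, 7.0, 9.5), site_densities=(0.2, 0.3, 0.1)):
    """Net negative surface charge (arbitrary mmol/g units) from three monoprotic acid sites."""
    h = 10.0 ** (-np.asarray(pH, dtype=float))
    charge = 0.0
    for pka, density in zip(pKas, site_densities):
        ka = 10.0 ** (-pka)
        deprotonated_fraction = ka / (ka + h)   # [S-O^-] / ([S-OH] + [S-O^-])
        charge += density * deprotonated_fraction
    return -charge                              # negative charge develops as pH rises

print([round(float(surface_charge(p)), 3) for p in (3, 5, 7, 9, 11)])
```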

  19. Computation of Difference Grobner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    Full Text Available This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials, we adapt to difference polynomial rings the involutive algorithm based on Janet-like division. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra), and we describe the main features of the package. Its applications are illustrated by the generation of finite difference approximations to linear partial differential equations and by the reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, then it constructs a Gröbner basis of the ideal.
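
    For illustration only, a Gröbner basis computation for an ordinary (commutative) polynomial ideal using SymPy; the LDA package itself is a Maple implementation for linear difference ideals.

```python
# Hedged sketch: a commutative Groebner basis with SymPy, as a simple analogue of the
# difference-algebra computations performed by LDA.
from sympy import groebner, symbols

x, y = symbols("x y")
G = groebner([x**2 + y**2 - 1, x * y - 1], x, y, order="lex")
print(list(G.exprs))
```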

  20. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with test-pattern generation and fault-coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of

  1. Computer-based interventions for drug use disorders: A systematic review

    Science.gov (United States)

    Moore, Brent A.; Fazzino, Tera; Garnet, Brian; Cutter, Christopher J.; Barry, Declan T.

    2011-01-01

    A range of innovative computer-based interventions for psychiatric disorders have been developed, and are promising for drug use disorders, due to reduced cost and greater availability compared to traditional treatment. Electronic searches were conducted from 1966 to November 19, 2009 using MEDLINE, Psychlit, and EMBASE. 468 non-duplicate records were identified. Two reviewers classified abstracts for study inclusion, resulting in 12 studies of moderate quality. Eleven studies were pilot or full-scale trials compared to a control condition. Interventions showed high acceptability despite substantial variation in type and amount of treatment. Compared to treatment-as-usual, computer-based interventions led to less substance use as well as higher motivation to change, better retention, and greater knowledge of presented information. Computer-based interventions for drug use disorders have the potential to dramatically expand and alter the landscape of treatment. Evaluation of internet and phone-based delivery that allow for treatment-on-demand in patients’ own environment is needed. PMID:21185683

  2. Study of cutting speed on surface roughness and chip formation when machining nickel-based alloy

    International Nuclear Information System (INIS)

    Khidhir, Basim A.; Mohamed, Bashir

    2010-01-01

    Nickel-based alloy is difficult to machine because of its low thermal diffusivity and high strength at elevated temperature. The machinability of the nickel-based alloy Hastelloy C-276 in turning operations has been investigated using different types of inserts under dry conditions on a computer numerical control (CNC) turning machine at different cutting speeds. The effects of cutting speed on surface roughness have been investigated. This study explores the types of wear caused by the effect of cutting speed on coated and uncoated carbide inserts. In addition, the effect of burr formation is investigated. The chip burr is found to take different shapes at lower speeds; triangles and squares were observed for both coated and uncoated tips. The conclusion from this study is that the transition from a thick continuous chip to a wider discontinuous chip is caused by the different types of inserts. The chip burr has a significant effect on tool damage starting at the depth-of-cut line. For the coated insert tips, the burr disappears when the speed increases above 150 m/min, with an improvement in surface roughness; increasing the speed above the same limit for uncoated insert tips increases the chip burr size. The results of this study showed that the surface finish of the nickel-based alloy is highly affected by the insert type with respect to cutting speed changes and its effect on chip burr formation and tool failure

  3. Perturbative computation of string one-loop corrections to Wilson loop minimal surfaces in AdS₅×S⁵

    Energy Technology Data Exchange (ETDEWEB)

    Forini, V. [Institut für Physik, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Tseytlin, A.A. [Theoretical Physics Group, Blackett Laboratory, Imperial College,London, SW7 2AZ (United Kingdom); Vescovi, E. [Institut für Physik, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Institute of Physics, University of São Paulo,Rua do Matão 1371, 05508-090 São Paulo (Brazil)

    2017-03-01

    We revisit the computation of the 1-loop string correction to the “latitude” minimal surface in AdS₅×S⁵ representing the 1/4 BPS Wilson loop in planar N=4 SYM theory, previously addressed in https://arxiv.org/abs/1512.00841 and https://arxiv.org/abs/1601.04708. We resolve the problem of matching with the subleading term in the strong coupling expansion of the exact gauge theory result (derived previously from localization) using a different method to compute determinants of 2d string fluctuation operators. We apply perturbation theory in a small parameter (angle of the latitude) corresponding to an expansion near the AdS₂ minimal surface representing the 1/2 BPS circular Wilson loop. This allows us to compute the corrections to the heat kernels and zeta-functions of the operators in terms of the known heat kernels on AdS₂. We apply the same method also to two other examples of Wilson loop surfaces: the generalized cusp and the k-wound circle.

  4. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  5. Computer- and Suggestion-based Cognitive Rehabilitation following Acquired Brain Injury

    DEFF Research Database (Denmark)

    Lindeløv, Jonas Kristoffer

    That is, training does not cause cognitive transfer and thus does not constitute “brain training” or “brain exercise” of any clinical relevance. A larger study found more promising results for a suggestion-based treatment delivered in a hypnotic procedure. Patients improved to above the population average in a matter...... of 4-8 hours, making this by far the most effective treatment compared to computer-based training, physical exercise, pharmaceuticals, meditation, and attention process training. The contrast between computer-based methods and the hypnotic suggestion treatment may reflect a more general discrepancy......

  6. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  7. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  8. Comparative evaluation of tensile bond strength of silicone-based denture liners after thermocycling and surface treatment.

    Science.gov (United States)

    Kaur, Harsimran; Datta, Kusum

    2015-01-01

    To examine, evaluate, and compare the tensile bond strength of two silicone-based liners, one autopolymerizing and one heat cured, when treated with different chemical etchants to improve their adhesion to denture base resin. One hundred and sixty test specimens of heat-cured polymethyl methacrylate (PMMA) were fabricated; 80 specimens were tested for tensile bond strength after bonding to the autopolymerizing resilient liner (Ufigel P) and the remaining 80 to the heat-cured resilient liner (Molloplast B). Each main group was further divided into four subgroups of 20 specimens each: one served as a control and three were subjected to surface treatment with different chemical etchants, namely dichloromethane, MMA monomer, and chloroform. The two silicone-based denture liners were processed between two PMMA specimens (10 mm × 10 mm × 40 mm) in the space provided by a 3 mm spacer, thermocycled (5-55°C) for 500 cycles, and then tensile bond strength measurements were made in a universal testing machine. One-way ANOVA showed a highly significant difference in the mean tensile bond strength values across all the groups. The computed Student's t-test statistics for the compared groups were greater than the critical values at both the 5% and 1% levels. Surface treatment of denture base resin with chemical etchants prior to the application of the silicone-based liners (Ufigel P and Molloplast-B) increased the tensile bond strength. The increase was the highest for specimens subjected to 180 s of MMA surface treatment and the lowest for the control group specimens.
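
    A sketch of the statistical comparison described above (one-way ANOVA across treatment groups followed by a pairwise t-test) using SciPy; the tensile-bond-strength values are made up, not the study's data.

```python
# Hedged sketch of the group comparison; all numbers are placeholders in MPa.
from scipy import stats

control         = [1.1, 1.3, 1.2, 1.0, 1.4]
dichloromethane = [1.8, 1.9, 2.0, 1.7, 1.9]
mma_monomer     = [2.4, 2.6, 2.5, 2.7, 2.3]
chloroform      = [1.6, 1.7, 1.9, 1.8, 1.6]

f_stat, p_anova = stats.f_oneway(control, dichloromethane, mma_monomer, chloroform)
t_stat, p_pair = stats.ttest_ind(mma_monomer, control)
print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.4f}; MMA vs control: t={t_stat:.2f}, p={p_pair:.4f}")
```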

  9. OpenCL-based vicinity computation for 3D multiresolution mesh compression

    Science.gov (United States)

    Hachicha, Soumaya; Elkefi, Akram; Ben Amar, Chokri

    2017-03-01

    3D multiresolution mesh compression systems are still widely addressed in many domains. These systems increasingly require volumetric data to be processed in real time, so performance becomes constrained by material resource usage and the need for an overall reduction in computational time. In this paper, our contribution lies entirely in computing, in real time, the triangle neighborhoods of 3D progressive meshes for a robust compression algorithm based on the scan-based wavelet transform (WT) technique. The originality of the latter algorithm is to compute the WT with minimum memory usage by processing data as they are acquired. However, with large data, this technique is considered poor in terms of computational complexity. For that reason, this work exploits the GPU to accelerate the computation, using OpenCL as a heterogeneous programming language. Experiments demonstrate that, aside from the portability across various platforms and the flexibility guaranteed by the OpenCL-based implementation, this method can achieve a speedup factor of 5 compared to the sequential CPU implementation.
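
    A sequential (CPU) sketch of the triangle-vicinity computation that the paper accelerates with OpenCL: two triangles are neighbours when they share an edge. A GPU version would parallelise the per-triangle work; only the logic is shown here.

```python
# Hedged sketch of triangle adjacency via shared edges; not the paper's OpenCL kernel.
from collections import defaultdict

def triangle_neighbours(faces):
    """faces: list of (i, j, k) vertex-index triples -> dict: face id -> set of neighbouring face ids."""
    edge_to_faces = defaultdict(set)
    for f_id, (a, b, c) in enumerate(faces):
        for edge in ((a, b), (b, c), (c, a)):
            edge_to_faces[tuple(sorted(edge))].add(f_id)
    neighbours = defaultdict(set)
    for shared in edge_to_faces.values():
        for f in shared:
            neighbours[f] |= shared - {f}
    return neighbours

print(triangle_neighbours([(0, 1, 2), (1, 2, 3), (2, 3, 4)]))
# face 0 borders face 1; face 1 borders faces 0 and 2; face 2 borders face 1
```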

  10. Microstructural evolution and surface properties of nanostructured Cu-based alloy by ultrasonic nanocrystalline surface modification technique

    Energy Technology Data Exchange (ETDEWEB)

    Amanov, Auezhan, E-mail: amanov_a@yahoo.com [Department of Mechanical Engineering, Sun Moon University, Asan 336-708 (Korea, Republic of); Cho, In-Sik [R&D Group, Mbrosia Co., Ltd., Asan 336-708 (Korea, Republic of); Pyun, Young-Sik [Department of Mechanical Engineering, Sun Moon University, Asan 336-708 (Korea, Republic of)

    2016-12-01

    Highlights: • A nanostructured surface was produced by the UNSM technique. • Porosities were eliminated from the surface by the UNSM technique. • Extremely high hardness was obtained at the top surface after UNSM treatment. • Friction and wear behavior was improved by the UNSM technique. • Resistance to scratching was improved by the UNSM technique. Abstract: A nanostructured surface layer with a thickness of about 180 μm was successfully produced in a Cu-based alloy using an ultrasonic nanocrystalline surface modification (UNSM) technique. The Cu-based alloy was sintered onto low-carbon steel using a powder metallurgy (P/M) method. Transmission electron microscope (TEM) characterization revealed that the severe plastic deformation introduced by the UNSM technique resulted in nano-sized grains in the topmost surface layer and deformation twins. It was also found by atomic force microscope (AFM) observations that the UNSM technique provides a significant reduction in the number of interconnected pores. The effect of the nanostructured surface layer on the tribological and micro-scratch properties of the Cu-based alloy specimens was investigated using a ball-on-disk tribometer and a micro-scratch tester, respectively. The results showed that the UNSM-treated specimen exhibited improved tribological and micro-scratch properties compared to the sintered specimen, which may be attributed to the presence of a nanostructured surface layer with increased surface hardness and reduced surface roughness. The findings from this study are expected to be applied in the automotive industry, in particular to connecting-rod bearings and bushings, in order to increase the efficiency and performance of internal combustion engines (ICEs).

  11. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  12. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  13. A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology

    Science.gov (United States)

    Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra

    2008-01-01

    Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…

  14. Providing Feedback on Computer-Based Algebra Homework in Middle-School Classrooms

    Science.gov (United States)

    Fyfe, Emily R.

    2016-01-01

    Homework is transforming at a rapid rate with continuous advances in educational technology. Computer-based homework, in particular, is gaining popularity across a range of schools, with little empirical evidence on how to optimize student learning. The current aim was to test the effects of different types of feedback on computer-based homework.…

  15. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)

  16. Computing Visible-Surface Representations,

    Science.gov (United States)

    1985-03-01

    [Report documentation form residue; recoverable details: author D. Terzopoulos, contract N00014-75-C-0643, Artificial Intelligence Laboratory, Massachusetts Institute of Technology; support for the laboratory's Artificial Intelligence research is provided in part by the Advanced Research Projects Agency.] The surviving abstract fragment concerns dynamically maintaining visible-surface representations, whether the intention is to model human vision or to design competent artificial vision systems.

  17. On the computation of the turbulent flow near rough surface

    Science.gov (United States)

    Matveev, S. K.; Jaychibekov, N. Zh.; Shalabayeva, B. S.

    2018-05-01

    One of the problems in constructing mathematical models of turbulence is the description of flows near a rough surface. Experimental study of such flows is also difficult because of the impossibility of measuring "inside" the roughness, and theoretical calculation is difficult because of the lack of equations describing the flow in this zone. In this paper, a new turbulence model based on a differential equation for the turbulent viscosity balance was used to describe turbulent flow near a rough surface. The difference between the new turbulence model and previously known ones lies in the choice of the constants and functions that determine the generation, dissipation and diffusion of viscosity.

  18. Computing derivative-based global sensitivity measures using polynomial chaos expansions

    International Nuclear Information System (INIS)

    Sudret, B.; Mai, C.V.

    2015-01-01

    In the field of computer experiments, sensitivity analysis aims at quantifying the relative importance of each input parameter (or combinations thereof) of a computational model with respect to the model output uncertainty. Variance decomposition methods leading to the well-known Sobol' indices are recognized as accurate techniques, albeit at a rather high computational cost. The use of polynomial chaos expansions (PCE) to compute Sobol' indices has alleviated this computational burden. However, when dealing with large dimensional input vectors, it is good practice to first use screening methods in order to discard unimportant variables. The derivative-based global sensitivity measures (DGSMs) have been developed recently in this respect. In this paper we show how polynomial chaos expansions may be used to compute DGSMs analytically as a mere post-processing. This requires the analytical derivation of the derivatives of the orthonormal polynomials which enter PC expansions. Closed-form expressions for Hermite, Legendre and Laguerre polynomial expansions are given. The efficiency of the approach is illustrated on two well-known benchmark problems in sensitivity analysis. - Highlights: • Derivative-based global sensitivity measures (DGSM) have been developed for screening purposes. • Polynomial chaos expansions (PCE) are used as a surrogate model of the original computational model. • From a PC expansion the DGSM can be computed analytically. • The paper provides the derivatives of Hermite, Legendre and Laguerre polynomials for this purpose
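
    A worked one-dimensional illustration of the post-processing idea: for a PCE in orthonormal probabilists' Hermite polynomials, psi_n' = sqrt(n) psi_(n-1), so E[(dM/dX)^2] = sum_n n a_n^2. The coefficients below are assumed, and the analytical value is checked by Monte Carlo.

```python
# Hedged 1-D sketch of a derivative-based sensitivity measure computed from PCE coefficients.
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial, sqrt

a = np.array([0.5, 1.2, -0.7, 0.3])               # assumed PCE coefficients a_0..a_3
dgsm_analytical = sum(n * a[n] ** 2 for n in range(len(a)))

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
# The model evaluated through its PCE: M(x) = sum_n a_n He_n(x) / sqrt(n!)
coef = [a[n] / sqrt(factorial(n)) for n in range(len(a))]
dcoef = [coef[n] * n for n in range(1, len(coef))]  # derivative series: He_n' = n He_{n-1}
dgsm_mc = np.mean(hermeval(x, dcoef) ** 2)

print(dgsm_analytical, dgsm_mc)                    # the two estimates should agree closely
```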

  19. Cross Validation Through Two-Dimensional Solution Surface for Cost-Sensitive SVM.

    Science.gov (United States)

    Gu, Bin; Sheng, Victor S; Tay, Keng Yeow; Romano, Walter; Li, Shuo

    2017-06-01

    Model selection plays an important role in cost-sensitive SVM (CS-SVM). It has been proven that the global minimum cross validation (CV) error can be efficiently computed based on the solution path for one-parameter learning problems. However, it is a challenge to obtain the global minimum CV error for CS-SVM based on a one-dimensional solution path and traditional grid search, because CS-SVM has two regularization parameters. In this paper, we propose a solution- and error-surface-based CV approach (CV-SES). More specifically, we first compute a two-dimensional solution surface for CS-SVM based on a bi-parameter space partition algorithm, which can fit solutions of CS-SVM for all values of both regularization parameters. Then, we compute a two-dimensional validation error surface for each CV fold, which can fit validation errors of CS-SVM for all values of both regularization parameters. Finally, we obtain the CV error surface by superposing K validation error surfaces, which can find the global minimum CV error of CS-SVM. Experiments are conducted on seven datasets for cost-sensitive learning and on four datasets for imbalanced learning. Experimental results not only show that our proposed CV-SES has better generalization ability than CS-SVM with various hybrids between grid search and solution path methods, and than the recently proposed cost-sensitive hinge loss SVM with three-dimensional grid search, but also show that CV-SES uses less running time.
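
    For contrast, a sketch of the grid-search baseline that CV-SES is compared against: cross-validation over a two-dimensional grid of per-class costs for a cost-sensitive SVM (expressed here through scikit-learn's class_weight). The dataset and grids are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch of two-dimensional grid-search CV for a cost-sensitive SVM baseline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, weights=[0.8, 0.2], random_state=0)

best = (None, -np.inf)
for c_pos in np.logspace(-2, 2, 9):          # cost weight for the minority class
    for c_neg in np.logspace(-2, 2, 9):      # cost weight for the majority class
        clf = SVC(C=1.0, class_weight={1: c_pos, 0: c_neg}, kernel="rbf")
        score = cross_val_score(clf, X, y, cv=5).mean()
        if score > best[1]:
            best = ((c_pos, c_neg), score)
print("best (C+, C-):", best[0], "CV accuracy:", round(best[1], 3))
```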

  20. Computer vision system in real-time for color determination on flat surface food

    Directory of Open Access Journals (Sweden)

    Erick Saldaña

    2013-03-01

    Full Text Available Artificial vision systems, also known as computer vision, are potent quality inspection tools which can be applied in pattern recognition for fruit and vegetable analysis. The aim of this research was to design, implement and calibrate a new computer vision system (CVS) in real time for color measurement on flat-surface food. For this purpose a device capable of performing this task (software and hardware) was designed and implemented, consisting of two phases: (a) image acquisition and (b) image processing and analysis. Both the algorithm and the graphical user interface (GUI) were developed in Matlab. The CVS calibration was performed against a conventional colorimeter (CIE L*a*b* model), with which the errors of the color parameters were estimated as eL* = 5.001%, ea* = 2.287%, and eb* = 4.314%, which ensures an adequate and efficient automation application in industrial processes for quality control in the food industry sector.

  1. Computer vision system in real-time for color determination on flat surface food

    Directory of Open Access Journals (Sweden)

    Erick Saldaña

    2013-01-01

    Full Text Available Artificial vision systems, also known as computer vision, are potent quality inspection tools which can be applied in pattern recognition for fruit and vegetable analysis. The aim of this research was to design, implement and calibrate a new computer vision system (CVS) in real time for color measurement on flat-surface food. For this purpose a device capable of performing this task (software and hardware) was designed and implemented, consisting of two phases: (a) image acquisition and (b) image processing and analysis. Both the algorithm and the graphical user interface (GUI) were developed in Matlab. The CVS calibration was performed against a conventional colorimeter (CIE L*a*b* model), with which the errors of the color parameters were estimated as eL* = 5.001%, ea* = 2.287%, and eb* = 4.314%, which ensures an adequate and efficient automation application in industrial processes for quality control in the food industry sector.
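
    A minimal sketch of the core computation in such a system: average the RGB values of a flat surface region and convert them to CIE L*a*b* (assuming sRGB input and a D65 white point), the space in which the calibration errors above are expressed. This is not the authors' Matlab implementation.

```python
# Hedged sketch of mean-colour measurement in CIE L*a*b* from an sRGB image patch.
import numpy as np

M_RGB_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                         [0.2126729, 0.7151522, 0.0721750],
                         [0.0193339, 0.1191920, 0.9503041]])
WHITE_D65 = np.array([0.95047, 1.0, 1.08883])

def srgb_region_to_lab(region_rgb_uint8: np.ndarray) -> np.ndarray:
    """region_rgb_uint8: (H, W, 3) image patch -> mean colour as (L*, a*, b*)."""
    rgb = region_rgb_uint8.reshape(-1, 3).mean(axis=0) / 255.0
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = M_RGB_TO_XYZ @ lin / WHITE_D65
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return np.array([L, a, b])

patch = np.full((50, 50, 3), (200, 120, 60), dtype=np.uint8)   # a flat "food" patch (toy)
print(np.round(srgb_region_to_lab(patch), 2))
```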

  2. Computer-based systems for nuclear power stations

    International Nuclear Information System (INIS)

    Humble, P.J.; Welbourne, D.; Belcher, G.

    1995-01-01

    The published intentions of vendors are for extensive touch-screen control and computer-based protection. The software features needed for acceptance in the UK are indicated. The defence in depth needed is analyzed. Current practice in aircraft flight control systems and the software methods available are discussed. Software partitioning and mathematically formal methods are appropriate for the structures and simple logic needed for nuclear power applications. The potential for claims of diversity and independence between two computer-based subsystems of a protection system is discussed. Features needed to meet a single failure criterion applied to software are discussed. Conclusions are given on the main factors which a design should allow for. The work reported was done for the Health and Safety Executive of the UK (HSE), and acknowledgement is given to them, to NNC Ltd and to GEC-Marconi Avionics Ltd for permission to publish. The opinions and recommendations expressed are those of the authors and do not necessarily reflect those of HSE. (Author)

  3. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  4. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking, such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  5. Three Dimensional Digital Sieving of Asphalt Mixture Based on X-ray Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chichun Hu

    2017-07-01

    Full Text Available In order to perform three-dimensional digital sieving based on X-ray computed tomography images, the definition of the digital sieve size (DSS) was proposed, defined as the minimum side length of the minimum bounding squares of all possible orthographic projections of an aggregate. A corresponding program was developed to reconstruct the aggregate structure and to obtain the DSS. Laboratory experiments on epoxy-filled aggregate specimens were conducted to investigate the difference between mechanical sieve analysis and the digital sieving technique. It was suggested that the concave surfaces of aggregates were a possible reason for the disparity between DSS and mechanical sieve size. A comparison between DSS and equivalent diameter was also performed. Moreover, the digital sieving technique was adopted to evaluate the gradation of stone mastic asphalt mixtures. The results showed that the closest agreement with the laboratory gradation curve was achieved by the calibrated DSS, among gradation curves based on calibrated DSS, un-calibrated DSS and equivalent diameter.
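
    A coarse numerical sketch of the DSS definition above: sample orthographic projection directions, project the aggregate's surface points onto each viewing plane, find the minimum bounding square of each projection, and keep the smallest square side. The sampling densities are assumptions, and a point cloud of the aggregate surface (e.g. from the CT reconstruction) is assumed as input; this is not the authors' program.

```python
# Hedged sketch of a digital-sieve-size estimate by sampled orthographic projections.
import numpy as np

def min_bounding_square_side(points_2d, n_angles=90):
    best = np.inf
    for theta in np.linspace(0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        rotated = points_2d @ np.array([[c, -s], [s, c]])
        extent = rotated.max(axis=0) - rotated.min(axis=0)
        best = min(best, max(extent))          # the square must cover both extents
    return best

def digital_sieve_size(points_3d, n_directions=200, seed=0):
    rng = np.random.default_rng(seed)
    dss = np.inf
    for _ in range(n_directions):
        d = rng.normal(size=3); d /= np.linalg.norm(d)
        u = np.cross(d, [0.0, 0.0, 1.0] if abs(d[2]) < 0.9 else [1.0, 0.0, 0.0])
        u /= np.linalg.norm(u); v = np.cross(d, u)
        projected = points_3d @ np.column_stack([u, v])   # orthographic projection onto the plane
        dss = min(dss, min_bounding_square_side(projected))
    return dss

cube = np.array([[x, y, z] for x in (0, 10) for y in (0, 10) for z in (0, 10)], float)
print(round(digital_sieve_size(cube), 2))      # approaches 10 for a 10-unit cube as sampling grows
```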

  6. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-08-27

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at the high magnetic field (∼10 T) and at low temperature ∼1 K.

  7. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-01-01

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at the high magnetic field (∼10 T) and at low temperature ∼1 K

  8. Nanoparticle-Based Surface Modifications for Microtribology Control and Superhydrophobicity

    Science.gov (United States)

    Hurst, Kendall Matthew

    2010-11-01

    The emergence of miniaturization techniques for consumer electronics has brought forth the relatively new and exciting field of microelectromechanical systems (MEMS). However, due to the inherent forces that exist between surfaces at the micro- and nanoscale, scientists and semiconductor manufacturers are still struggling to improve the lifetime and reliability of complex microdevices. Due to the extremely large surface area-to-volume ratio of typical MEMS and microstructured surfaces, dominant interfacial forces exist which can be detrimental to their operational lifetime. In particular, van der Waals, capillary, and electrostatic forces contribute to the permanent adhesion, or stiction, of microfabricated surfaces. This strong adhesion force also contributes to the friction and wear of these silicon-based systems. The scope of this work was to examine the effect of utilizing nanoparticles as the basis for roughening surfaces for the purpose of creating films with anti-adhesive and/or superhydrophobic properties. All of the studies presented in this work are focused around a gas-expanded liquid (GXL) process that promotes the deposition of colloidal gold nanoparticles (AuNPs) into conformal thin films. The GXL particle deposition process is finalized by a critical point drying step which is advantageous to the microelectromechanical systems and semiconductor (IC) industries. In fact, preliminary results illustrated that the GXL particle deposition process can easily be integrated into current MEMS microfabrication processes. Thin films of AuNPs deposited onto the surfaces of silicon-based MEMS and tribology test devices were shown to have a dramatic effect on the adhesion of microstructures. In the various investigations, the apparent work of adhesion between surfaces was reduced by 2-4 orders of magnitude. This effect is greatly attributed to the roughening of the typically smooth silicon oxide surfaces which, in turn, dramatically decreases the "real area of

  9. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  10. Surface characterization of collagen/elastin based biomaterials for tissue regeneration

    International Nuclear Information System (INIS)

    Skopinska-Wisniewska, J.; Sionkowska, A.; Kaminska, A.; Kaznica, A.; Jachimiak, R.; Drewa, T.

    2009-01-01

    Collagen and elastin are the main proteins of the extracellular matrix. Collagen plays a crucial role in the tensile strength of tissues, whereas elastin provides resilience to many organs. Both biopolymers are readily available and biocompatible. These properties indicate that collagen and elastin are good components of materials for many potential medical applications. The surface properties of biomaterials play an important role in biomedicine, as the majority of biological reactions occur on the surface of implanted materials. One of the methods of surface modification is UV-irradiation. Exposure of the biomaterial to ultraviolet light can alter the surface properties of the materials, their chemical stability, swelling properties and mechanical properties as well. The aim of our work was to study the surface properties and biocompatibility of new collagen/elastin based biomaterials and to consider the influence of ultraviolet light on these properties. The surface properties of collagen/elastin based biomaterials modified by UV-irradiation were studied using the technique of atomic force microscopy (AFM) and contact angle measurements. On the basis of the results the surface free energy and its polar component were calculated using the Owens-Wendt method. To assess the biological performance of films based on collagen, elastin and their blends, the response of 3T3 cells was investigated. It was found that the surface of the collagen/elastin film is enriched in the less polar component - collagen. Exposure to UV light increases the polarity of collagen/elastin based films, due to the photooxidation process. The AFM images have shown that the topography and roughness of the materials were also affected by UV-irradiation. The changes in surface properties influence the interaction between the material's surface and cells. The investigation of 3T3 cells grown on films based on collagen, elastin and their blends leads to the conclusion that higher content of elastin in biomaterial

  11. Surface characterization of collagen/elastin based biomaterials for tissue regeneration

    Energy Technology Data Exchange (ETDEWEB)

    Skopinska-Wisniewska, J., E-mail: joanna@chem.uni.torun.pl [Faculty of Chemistry, Nicolaus Copernicus University, Gagarin 7, 87-100 Torun (Poland); Sionkowska, A.; Kaminska, A. [Faculty of Chemistry, Nicolaus Copernicus University, Gagarin 7, 87-100 Torun (Poland); Kaznica, A.; Jachimiak, R.; Drewa, T. [Collegium Medicum, Nicolaus Copernicus University, Karlowicz 24, 85-092 Bydgoszcz (Poland)

    2009-07-15

    Collagen and elastin are the main proteins of the extracellular matrix. Collagen plays a crucial role in the tensile strength of tissues, whereas elastin provides resilience to many organs. Both biopolymers are readily available and biocompatible. These properties indicate that collagen and elastin are good components of materials for many potential medical applications. The surface properties of biomaterials play an important role in biomedicine, as the majority of biological reactions occur on the surface of implanted materials. One of the methods of surface modification is UV-irradiation. Exposure of the biomaterial to ultraviolet light can alter the surface properties of the materials, their chemical stability, swelling properties and mechanical properties as well. The aim of our work was to study the surface properties and biocompatibility of new collagen/elastin based biomaterials and to consider the influence of ultraviolet light on these properties. The surface properties of collagen/elastin based biomaterials modified by UV-irradiation were studied using the technique of atomic force microscopy (AFM) and contact angle measurements. On the basis of the results the surface free energy and its polar component were calculated using the Owens-Wendt method. To assess the biological performance of films based on collagen, elastin and their blends, the response of 3T3 cells was investigated. It was found that the surface of the collagen/elastin film is enriched in the less polar component - collagen. Exposure to UV light increases the polarity of collagen/elastin based films, due to the photooxidation process. The AFM images have shown that the topography and roughness of the materials were also affected by UV-irradiation. The changes in surface properties influence the interaction between the material's surface and cells. The investigation of 3T3 cells grown on films based on collagen, elastin and their blends leads to the conclusion that higher content of elastin in

  12. Response mechanism for surface acoustic wave gas sensors based on surface-adsorption.

    Science.gov (United States)

    Liu, Jiansheng; Lu, Yanyan

    2014-04-16

    A theoretical model is established to describe the response mechanism of surface acoustic wave (SAW) gas sensors based on physical adsorption on the detector surface. Wohltjen's method is utilized to describe the relationship between the sensor output (the frequency shift of the SAW oscillator) and the mass loaded on the detector surface. The Brunauer-Emmett-Teller (BET) formula and its improved form are introduced to describe the adsorption behavior of the gas on the detector surface. By combining the two methods, we obtain a theoretical model for the response mechanism of SAW gas sensors. Using a commercial SAW gas chromatography (GC) analyzer, an experiment is performed to measure the frequency shifts caused by different concentrations of dimethyl methylphosphonate (DMMP). The parameters in the model are obtained by fitting the experimental results, and the theoretical curve agrees well with the experimental data.
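
    A minimal Python sketch of the combined picture described above (a BET isotherm feeding a mass-loading frequency shift) is given below. The lumped gain, the BET constant and all data values are invented placeholders; this is not the paper's improved BET form or its fitted DMMP parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def bet_coverage(p_rel, c):
    """BET isotherm: adsorbed amount relative to a monolayer as a function
    of relative pressure p_rel = p/p0 and the BET constant c."""
    x = np.asarray(p_rel, dtype=float)
    return c * x / ((1.0 - x) * (1.0 + (c - 1.0) * x))

def frequency_shift(p_rel, c, gain):
    """Mass-loading response: the oscillator frequency shift is taken to be
    proportional to the adsorbed mass, so the monolayer mass and the SAW
    mass sensitivity are lumped into a single gain (Hz per monolayer)."""
    return gain * bet_coverage(p_rel, c)

# Fit the two free parameters to frequency shifts measured at several
# relative concentrations (all numbers below are invented for illustration).
p_rel = np.array([0.05, 0.10, 0.20, 0.40, 0.60])
shift_hz = np.array([120.0, 210.0, 380.0, 760.0, 1300.0])
(c_fit, gain_fit), _ = curve_fit(frequency_shift, p_rel, shift_hz, p0=[5.0, 500.0])
print("fitted BET constant c = %.2f, gain = %.0f Hz/monolayer" % (c_fit, gain_fit))
```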

  13. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  14. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  15. Toward prethreshold gate-based quantum simulation of chemical dynamics: using potential energy surfaces to simulate few-channel molecular collisions

    Science.gov (United States)

    Sornborger, Andrew T.; Stancil, Phillip; Geller, Michael R.

    2018-05-01

    One of the most promising applications of an error-corrected universal quantum computer is the efficient simulation of complex quantum systems such as large molecular systems. In this application, one is interested in both the electronic structure such as the ground state energy and dynamical properties such as the scattering cross section and chemical reaction rates. However, most theoretical work and experimental demonstrations have focused on the quantum computation of energies and energy surfaces. In this work, we attempt to make the prethreshold (not error-corrected) quantum simulation of dynamical properties practical as well. We show that the use of precomputed potential energy surfaces and couplings enables the gate-based simulation of few-channel but otherwise realistic molecular collisions. Our approach is based on the widely used Born-Oppenheimer approximation for the structure problem coupled with a semiclassical method for the dynamics. In the latter the electrons are treated quantum mechanically but the nuclei are classical, which restricts the collisions to high energy or temperature (typically above ≈ 10 eV). By using operator splitting techniques optimized for the resulting time-dependent Hamiltonian simulation problem, we give several physically realistic collision examples, with 3-8 channels and circuit depths < 1000.
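
    As a purely classical illustration of the operator-splitting step mentioned above, the sketch below propagates channel amplitudes for a few-channel, time-dependent Hamiltonian assembled from hypothetical precomputed channel energies and couplings, using second-order Strang splitting. On a gate-based quantum device each exponential would instead be compiled into a circuit; nothing here reproduces the paper's specific collision systems.

```python
import numpy as np
from scipy.linalg import expm

def propagate(c0, diag_of_t, coup_of_t, t_grid):
    """Strang (second-order) operator splitting for a few-channel,
    time-dependent Hamiltonian H(t) = D(t) + C(t): D holds the channel
    energies along a precomputed potential surface, C the couplings."""
    c = np.asarray(c0, dtype=complex)
    for k in range(len(t_grid) - 1):
        t, dt = t_grid[k], t_grid[k + 1] - t_grid[k]
        d = diag_of_t(t + 0.5 * dt)           # channel energies at mid-step
        C = coup_of_t(t + 0.5 * dt)           # off-diagonal couplings at mid-step
        half = np.exp(-0.5j * dt * d)         # exp(-i D dt/2), applied element-wise
        c = half * (expm(-1j * dt * C) @ (half * c))
    return c

# Toy 2-channel example with a Gaussian coupling pulse (illustrative only).
t = np.linspace(-10.0, 10.0, 2000)
diag = lambda s: np.array([0.0, 0.5])
coup = lambda s: 0.2 * np.exp(-s**2) * np.array([[0.0, 1.0], [1.0, 0.0]])
c_final = propagate([1.0, 0.0], diag, coup, t)
print("channel populations:", np.abs(c_final) ** 2)
```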

  16. Computer simulation of the relationship between selected properties of laser remelted tool steel surface layer

    Energy Technology Data Exchange (ETDEWEB)

    Bonek, Mirosław, E-mail: miroslaw.bonek@polsl.pl; Śliwa, Agata; Mikuła, Jarosław

    2016-12-01

    Highlights: • Prediction of the properties of a laser-remelted surface layer with the use of FEM analysis. • The simulation was applied to determine the shape of the molten pool of the remelted surface. • Applying a numerical FEM model to simulate laser surface treatment meaningfully shortens the time needed to select optimum parameters. • An FEM model was established for the purpose of building a computer simulation. - Abstract: The investigations include a Finite Element Method (FEM) simulation model of the remelting of a PMHSS6-5-3 high-speed steel surface layer using a high power diode laser (HPDL). The FEM computations were performed using ANSYS software. The scope of the FEM simulation was the determination of the temperature distribution during the laser alloying process for various process configurations regarding the laser beam power and the method of powder deposition, as a pre-coated paste or a surface with machined grooves. The simulation was performed on five different 3-dimensional models. The model assumed nonlinear, temperature-dependent thermal conductivity, specific heat and density. The heating process was represented as a heat flux corresponding to laser beam powers of 1.4, 1.7 and 2.1 kW. Latent heat effects are considered during solidification. The molten pool is composed of the same material as the substrate and there is no chemical reaction. The absorptivity of laser energy depended on the simulated material properties and surface condition. The FEM simulation allows the heat-affected zone and the temperature distribution in the sample to be determined as a function of time, and thus allows estimation of the structural changes taking place during the laser remelting process. The simulation was applied to determine the shape of the molten pool and the
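
    To make the kind of temperature-field computation described above concrete, here is a deliberately simplified 1D explicit finite-difference sketch of laser surface heating with a temperature-dependent conductivity. It is not the paper's 3D ANSYS model; the material data, laser flux and geometry are assumed placeholder values.

```python
import numpy as np

def laser_heating_1d(depth=2e-3, n=101, t_end=0.05, q_laser=2.1e7,
                     rho=8100.0, cp=460.0):
    """Explicit 1D transient-conduction sketch of laser surface heating with
    temperature-dependent conductivity k(T). All material data, the surface
    flux q_laser (W/m^2) and the geometry are illustrative placeholders."""
    k_of_T = lambda T: 24.0 + 0.01 * (T - 300.0)      # assumed linear k(T), W/(m K)
    dx = depth / (n - 1)
    T = np.full(n, 300.0)                             # initial temperature, K
    dt = 0.2 * dx**2 * rho * cp / k_of_T(T).max()     # conservative stable time step
    flux = np.zeros(n + 1)                            # heat flow through cell interfaces
    for _ in range(int(t_end / dt)):
        k = k_of_T(T)
        flux[1:-1] = -0.5 * (k[:-1] + k[1:]) * np.diff(T) / dx
        flux[0] = q_laser                             # laser flux entering the surface
        flux[-1] = 0.0                                # adiabatic bottom boundary
        T += dt / (rho * cp * dx) * (flux[:-1] - flux[1:])
    return T

T = laser_heating_1d()
print("surface temperature after heating ~ %.0f K" % T[0])
```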

  17. Surface phase transitions in cu-based solid solutions

    Science.gov (United States)

    Zhevnenko, S. N.; Chernyshikhin, S. V.

    2017-11-01

    We have measured the surface energy in two-component Cu-based systems in an H2 + Ar gas atmosphere. The experiments on solid Cu [Ag] and Cu [Co] solutions show the presence of phase transitions on the surfaces. The isotherms of the surface energy have singularities (a minimum in the case of copper solid solutions with silver and a maximum in the case of solid solutions with cobalt). In both cases, the surface phase transitions cause a deficiency of surface miscibility: the formation of a monolayer (multilayer) (Cu-Ag) or of nanoscale particles (Cu-Co). At the same time, according to the bulk phase diagrams, the concentration and temperature of the surface phase transitions correspond to the solid solution in the bulk. The method also permits determining the rate of diffusional creep in addition to the surface energy. The temperature and concentration dependence of the solid solutions' viscosity coefficient supports the occurrence of the surface phase transitions and provides insights into the diffusion properties of the transforming surfaces.

  18. Computational chemistry and metal-based radiopharmaceuticals

    International Nuclear Information System (INIS)

    Neves, M.; Fausto, R.

    1998-01-01

    Computer-assisted techniques have found extensive use in the design of organic pharmaceuticals but have not been widely applied to metal complexes, particularly radiopharmaceuticals. Some examples of computer-generated structures of complexes of In, Ga and Tc with N, S, O and P donor ligands are presented. Besides parameters directly related to the molecular geometries, molecular properties of the predicted structures, such as ionic charges or dipole moments, are considered in relation to biodistribution studies. The structures of a series of neutral oxo Tc-biguanide complexes are predicted by molecular mechanics calculations, and their interactions with water molecules or peptide chains are correlated with experimental data on partition coefficients and the percentage of human protein binding. The results stress the interest of using molecular modelling to predict molecular properties of metal-based radiopharmaceuticals, which can be successfully correlated with the results of in vitro studies. (author)

  19. Gaussian-Based Smooth Dielectric Function: A Surface-Free Approach for Modeling Macromolecular Binding in Solvents

    Directory of Open Access Journals (Sweden)

    Arghya Chakravorty

    2018-03-01

    Full Text Available Conventional modeling techniques for macromolecular solvation and its effect on binding, in the framework of Poisson-Boltzmann based implicit solvent models, make use of a geometrically defined surface to depict the separation of the macromolecular interior (low dielectric constant) from the solvent phase (high dielectric constant). Though this simplification saves time and computational resources without significantly compromising the accuracy of free energy calculations, it bypasses some of the key physico-chemical properties of the solute-solvent interface, e.g., the altered flexibility of water molecules and of side chains at the interface, which results in dielectric properties different from those of both bulk water and the macromolecular interior. Here we present a Gaussian-based smooth dielectric model, an inhomogeneous dielectric distribution model that mimics the effect of macromolecular flexibility and captures the altered properties of surface-bound water molecules. Thus, the model delivers a smooth transition of dielectric properties from the macromolecular interior to the solvent phase, eliminating any unphysical surface separating the two phases. Using various examples of macromolecular binding, we demonstrate its utility and illustrate the comparison with the conventional 2-dielectric model. We also showcase some additional abilities of this model, viz. to account for the effect of electrolytes in the solution and to render the distribution profile of water across a lipid membrane.
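
    The core idea, replacing a sharp molecular surface with an atom-centered Gaussian density that blends solute and solvent dielectric values, can be sketched in a few lines. The functional form, the sigma parameter and the two dielectric constants below are generic assumptions, not the published parameterization.

```python
import numpy as np

def gaussian_density(grid_pts, atom_xyz, atom_radii, sigma=0.93):
    """Atom-centered Gaussian density on a set of grid points; overlapping
    Gaussians are combined as 1 - prod(1 - g_i), so the density saturates
    at 1 inside the solute (a common choice; parameters are placeholders)."""
    d2 = np.sum((grid_pts[:, None, :] - atom_xyz[None, :, :]) ** 2, axis=-1)
    g = np.exp(-d2 / (sigma ** 2 * atom_radii[None, :] ** 2))
    return 1.0 - np.prod(1.0 - g, axis=1)

def smooth_dielectric(grid_pts, atom_xyz, atom_radii, eps_in=2.0, eps_out=80.0):
    """Inhomogeneous dielectric map: blends solute and solvent values
    through the Gaussian density instead of using a sharp molecular surface."""
    rho = gaussian_density(grid_pts, atom_xyz, atom_radii)
    return rho * eps_in + (1.0 - rho) * eps_out

# Tiny example: two "atoms" along a 1D line of grid points (illustrative).
pts = np.stack([np.linspace(-6, 6, 13), np.zeros(13), np.zeros(13)], axis=1)
eps = smooth_dielectric(pts, np.array([[-1.5, 0, 0], [1.5, 0, 0]]), np.array([2.0, 2.0]))
print(np.round(eps, 1))
```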

  20. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis, through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng), has been conducted using paper-based tests (PBT). The paper-based test model has several weaknesses: it wastes paper, questions can leak to the public, and test results can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, particular attention was paid to protecting the test questions before they are displayed, through encryption and decryption; the RSA cryptography algorithm was used for this purpose. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle, as sketched below. The network architecture used in the computer-based test application was a client-server model over a Local Area Network (LAN). The result of the design was a computer-based test application for the admission selection of Politeknik Negeri Bengkalis.
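
    The question randomization step mentioned above is the classic Fisher-Yates shuffle; a small Python sketch (not the project's actual code, which was designed in UML for a client-server CBT application) follows.

```python
import random

def fisher_yates_shuffle(items, rng=None):
    """Fisher-Yates shuffle of a copy of `items`, as used to randomize the
    order of questions drawn from the bank (a sketch, not the project's
    actual implementation)."""
    rng = rng or random.Random()
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = rng.randint(0, i)          # uniform pick from the not-yet-fixed prefix
        a[i], a[j] = a[j], a[i]
    return a

question_ids = list(range(1, 21))      # hypothetical question bank IDs
print(fisher_yates_shuffle(question_ids, random.Random(42)))
```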

  1. Summary of ground water and surface water flow and contaminant transport computer codes used at the Idaho National Engineering Laboratory (INEL)

    International Nuclear Information System (INIS)

    Bandy, P.J.; Hall, L.F.

    1993-03-01

    This report presents information on computer codes for numerical and analytical models that have been used at the Idaho National Engineering Laboratory (INEL) to model ground water and surface water flow and contaminant transport. Organizations conducting modeling at the INEL include EG&G Idaho, Inc., the US Geological Survey, and Westinghouse Idaho Nuclear Company. The information provided for each computer code includes: the agency responsible for the modeling effort, the name of the computer code, the proprietor of the code (copyright holder or original author), validation and verification studies, applications of the model at the INEL, the prime user of the model, a computer code description, computing environment requirements, and documentation and references for the computer code.

  2. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.

  3. Surfaces of Minimal Paths from Topological Structures and Applications to 3D Object Segmentation

    KAUST Repository

    Algarni, Marei

    2017-10-24

    Extracting surfaces that represent the boundaries of objects of interest from volumetric images has important applications in various scientific domains, from medicine to geology. In this thesis, I introduce novel mathematical, computational, and algorithmic machinery for the extraction of sheet-like surfaces (with boundary) whose boundary is unknown a priori, a particularly important case in applications for which no convenient methods exist. This case of a surface with boundary arises, for example, when extracting faults (among other geological structures) from seismic images. Another application domain is the extraction of structures in the lung from computed tomography (CT) images. Although many methods have been developed in computer vision for the extraction of surfaces, including level sets, convex optimization approaches, and graph cut methods, none of these methods appear to be applicable to the case of surfaces with boundary. The novel methods for surface extraction derived in this thesis are built on the theory of Minimal Paths, which has been used primarily to extract curves in noisy or corrupted images and has had wide applicability in 2D computer vision. This thesis extends such methods to surfaces, and it is based on the novel observation that surfaces can be determined by extracting topological structures from the solution of the eikonal partial differential equation (PDE), which is the basis of Minimal Path theory. Although topological structures are known to be difficult to extract from images, which are both noisy and discrete, this thesis builds robust methods based on Morse theory and computational topology to address such issues. The algorithms have run-time complexity O(N log N), lower than that of existing approaches. The thesis details the algorithms and theory, and presents an extensive experimental evaluation on seismic and medical images. Experiments show improvements over existing approaches in accuracy, computational speed, and user convenience.
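
    The continuous machinery of the thesis rests on minimal-path distances obtained from the eikonal PDE. As a rough discrete stand-in, the sketch below computes minimal-path costs on a weighted grid with Dijkstra's algorithm, which shares the O(N log N) behaviour of fast marching; it illustrates only the underlying idea, not the thesis' surface-extraction algorithm.

```python
import heapq
import numpy as np

def minimal_path_cost(weights, source):
    """Dijkstra on a 4-connected grid: a discrete stand-in for solving the
    eikonal equation |grad U| = W that underlies Minimal Path methods."""
    h, w = weights.shape
    U = np.full((h, w), np.inf)
    U[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, (i, j) = heapq.heappop(heap)
        if d > U[i, j]:
            continue                                  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                nd = d + 0.5 * (weights[i, j] + weights[ni, nj])
                if nd < U[ni, nj]:
                    U[ni, nj] = nd
                    heapq.heappush(heap, (nd, (ni, nj)))
    return U

# Toy cost image: a cheap "valley" along the middle row (illustrative).
W = np.ones((5, 7)); W[2, :] = 0.1
print(np.round(minimal_path_cost(W, (2, 0)), 2))
```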

  4. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
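
    For readers unfamiliar with SRC, the sketch below shows the basic, non-adaptive classification rule assumed as the starting point of the paper: sparse coding of a test feature over a dictionary of training samples followed by class-wise residual comparison. The dictionary-update schemes proposed in the paper are not reproduced here, and all data are random stand-ins for EEG features.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def src_classify(D, labels, x, n_nonzero=10):
    """Plain sparse representation-based classification (SRC): sparsely code
    the test feature x over a dictionary D whose columns are training
    samples, then assign the class whose atoms give the smallest
    reconstruction residual."""
    Dn = D / np.linalg.norm(D, axis=0)                  # unit-norm atoms
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
    omp.fit(Dn, x)
    alpha = omp.coef_
    residuals = {c: np.linalg.norm(x - Dn[:, labels == c] @ alpha[labels == c])
                 for c in np.unique(labels)}
    return min(residuals, key=residuals.get), residuals

# Toy example with random stand-ins for EEG features (illustrative only).
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 40))                       # 40 training epochs, 64-dim features
labels = np.repeat([0, 1], 20)                          # two hypothetical classes
x = D[:, 3] + 0.1 * rng.standard_normal(64)             # noisy copy of a class-0 sample
print("predicted class:", src_classify(D, labels, x)[0])
```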

  5. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  6. 3-D image-based numerical computations of snow permeability: links to specific surface area, density, and microstructural anisotropy

    Directory of Open Access Journals (Sweden)

    N. Calonne

    2012-09-01

    Full Text Available We used three-dimensional (3-D) images of snow microstructure to carry out numerical estimations of the full tensor of the intrinsic permeability of snow (K). This study was performed on 35 snow samples, spanning a wide range of seasonal snow types. For several snow samples, a significant anisotropy of permeability was detected and is consistent with that observed for the effective thermal conductivity obtained from the same samples. The anisotropy coefficient, defined as the ratio of the vertical over the horizontal components of K, ranges from 0.74 for a sample of decomposing precipitation particles collected in the field to 1.66 for a depth hoar specimen. Because the permeability is related to a characteristic length, we introduced a dimensionless tensor K* = K/r_es^2, where the equivalent sphere radius of ice grains (r_es) is computed from the specific surface area of snow (SSA) and the ice density (ρ_i) as follows: r_es = 3/(SSA × ρ_i). We define K̄ and K̄* as the average of the diagonal components of K and K*, respectively. The 35 values of K̄* were fitted to snow density (ρ_s) and provide the following regression: K̄ = (3.0 ± 0.3) r_es^2 exp((−0.0130 ± 0.0003) ρ_s). We noted that the anisotropy of permeability does not significantly affect the proposed equation. This regression curve was applied to several independent datasets from the literature and compared to other existing regression curves or analytical models. The results show that it is probably the best currently available simple relationship linking the average value of permeability, K̄, to snow density and specific surface area.
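
    The quoted regression is easy to evaluate; the short sketch below uses the central values of the fit and assumes SSA in m^2/kg and densities in kg/m^3, so that r_es comes out in metres and K in m^2. The example inputs are illustrative, not taken from the 35 samples.

```python
import numpy as np

def snow_permeability(ssa, rho_snow, rho_ice=917.0):
    """Permeability regression quoted in the abstract:
    r_es = 3 / (SSA * rho_ice) and K = 3.0 * r_es**2 * exp(-0.0130 * rho_snow),
    with SSA in m^2/kg, densities in kg/m^3 and K in m^2 (central fit values)."""
    r_es = 3.0 / (ssa * rho_ice)
    return 3.0 * r_es**2 * np.exp(-0.0130 * rho_snow)

# Example: a coarse-grained sample (illustrative input values).
print("K ~ %.2e m^2" % snow_permeability(ssa=15.0, rho_snow=300.0))
```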

  7. The use of gold nanoparticle aggregation for DNA computing and logic-based biomolecular detection

    International Nuclear Information System (INIS)

    Lee, In-Hee; Yang, Kyung-Ae; Zhang, Byoung-Tak; Lee, Ji-Hoon; Park, Ji-Yoon; Chai, Young Gyu; Lee, Jae-Hoon

    2008-01-01

    The use of DNA molecules as a physical computational material has attracted much interest, especially in the area of DNA computing. DNAs are also useful for logical control and analysis of biological systems if efficient visualization methods are available. Here we present a quick and simple visualization technique that displays the results of the DNA computing process based on a colorimetric change induced by gold nanoparticle aggregation, and we apply it to the logic-based detection of biomolecules. Our results demonstrate its effectiveness in both DNA-based logical computation and logic-based biomolecular detection

  8. Concept of development of integrated computer-based control system for 'Ukryttia' object

    International Nuclear Information System (INIS)

    Buyal'skij, V.M.; Maslov, V.P.

    2003-01-01

    The structural concept for developing an integrated computer-based control system (CCS) for the Chernobyl NPP 'Ukryttia' Object is presented, based on the general concept of the CCS design process for organizational and technical management tasks. The concept relies on state-of-the-art architectural design techniques and allows modern computer-aided facilities to be used for developing the functional model, the information (logical and physical) models, and the object model of the system under design

  9. A Geometry Based Infra-Structure for Computational Analysis and Design

    Science.gov (United States)

    Haimes, Robert

    1998-01-01

    The computational steps traditionally taken for most engineering analysis suites (computational fluid dynamics (CFD), structural analysis, heat transfer, etc.) are: (1) Surface Generation -- usually by employing a Computer Assisted Design (CAD) system; (2) Grid Generation -- preparing the volume for the simulation; (3) Flow Solver -- producing the results at the specified operational point; (4) Post-processing Visualization -- interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. These vendors couple directly to a number of CAD systems and are executed from within the CAD Graphical User Interface (GUI). It should be noted that the structural analysis problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data is transmitted between phases via files. In most cases, the output from a CAD system can go to Initial Graphics Exchange Specification (IGES) or Standard Exchange Program (STEP) files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the gridding (e.g., PLOT3D data formats). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Specifically, the problems with this procedure are: (1) File based -- Information flows from one step to the next via data files with formats specified for that procedure. File standards, when they exist, are wholly inadequate. For example, geometry from CAD systems (transmitted via IGES files) is defined as disjoint surfaces and curves (as well as masses of other information of no interest for the Grid Generator

  10. Computational Redox Potential Predictions: Applications to Inorganic and Organic Aqueous Complexes, and Complexes Adsorbed to Mineral Surfaces

    Directory of Open Access Journals (Sweden)

    Krishnamoorthy Arumugam

    2014-04-01

    Full Text Available Applications of redox processes range over a number of scientific fields. This review article summarizes the theory behind the calculation of redox potentials in solution for species such as organic compounds, inorganic complexes, actinides, battery materials, and mineral surface-bound species. Different computational approaches to predict and determine the redox potentials of electron transitions are discussed, along with their respective pros and cons. Subsequently, recommendations are made for the computational settings required for accurate calculation of redox potentials. This article reviews the importance of computational parameters, such as basis sets, density functional theory (DFT) functionals, and relativistic approaches, and the role that physicochemical processes such as hydration or spin-orbit coupling play in shifting redox potentials, and it will aid in finding suitable combinations of approaches for different chemical and geochemical applications. Identifying cost-effective and credible computational approaches is essential to benchmark redox potential calculations against experiments. Once a good theoretical approach is found to model the chemistry and thermodynamics of the redox and electron transfer process, this knowledge can be incorporated into models of more complex reaction mechanisms that include diffusion in the solute, surface diffusion, and dehydration, to name a few. This knowledge is important to fully understand the nature of redox processes, be it a geochemical process that dictates natural redox reactions or one that is being used for the optimization of a chemical process in industry. In addition, it will help identify materials that will be useful to design catalytic redox agents, materials to be used for batteries and photovoltaic processes, and new and improved remediation strategies in environmental engineering, for example the

  11. Modifications of Surface Wave Discrimination Filter Based on the Polarization Properties

    International Nuclear Information System (INIS)

    Kutlu, Y. A.; Sayil, N.

    2007-01-01

    The polarization properties of Love and Rayleigh waves are utilized to design a surface wave discrimination filter. The filtering process, for a selected window length and moving interval, is that the amplitudes at each frequency on the vertical, radial and transverse components are weighted according to how closely they follow the theoretical three-dimensional particle motion pattern. In this study, the weighting functions have been modified for epicentral distances smaller than about 2200 km to correspond with the angular distribution of polarization parameters obtained from computed synthetic seismograms. The modified surface wave discrimination filter has been tested on synthetic seismograms and on digital three-component broadband records from the Trabzon earthquake station.

  12. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  13. Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support.

    Science.gov (United States)

    van den Berg, Yvonne H M; Gommans, Rob

    2017-09-01

    New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.

  14. Segmentation process significantly influences the accuracy of 3D surface models derived from cone beam computed tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Schepers, Rutger H; Gerrits, Pieter; Ren, Yijin

    AIMS: To assess the accuracy of surface models derived from 3D cone beam computed tomography (CBCT) with two different segmentation protocols. MATERIALS AND METHODS: Seven fresh-frozen cadaver heads were used. There was no conflict of interests in this study. CBCT scans were made of the heads and 3D

  15. Quantum computing with Majorana fermion codes

    Science.gov (United States)

    Litinski, Daniel; von Oppen, Felix

    2018-05-01

    We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.

  16. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    Science.gov (United States)

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.

  17. SurfaceSlide: a multitouch digital pathology platform.

    Directory of Open Access Journals (Sweden)

    Yinhai Wang

    Full Text Available BACKGROUND: Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathological education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and by its reliance on a cumbersome combination of mouse and keyboard for navigation and annotation. METHODOLOGY: In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multitouch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images. These images are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the regions surrounding a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay of image downloading from the internet. We compared the usability of SurfaceSlide against Aperio ImageScope and the PathXL online viewer. CONCLUSION: SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human-digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice.

  18. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF, which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  19. A nodally condensed SUPG formulation for free-surface computation of steady-state flows constrained by unilateral contact - Application to rolling

    Science.gov (United States)

    Arora, Shitij; Fourment, Lionel

    2018-05-01

    In the context of the simulation of industrial hot forming processes, the resultant time-dependent thermo-mechanical multi-field problem (v, p, σ, ε) can be sped up by 10-50 times using steady-state methods compared to conventional incremental methods. Though steady-state techniques have been used in the past, they were applied only to simple configurations with structured meshes, whereas modern problems are set in the framework of complex configurations, unstructured meshes and parallel computing. These methods remove the time dependency from the equations but introduce an additional unknown into the problem: the steady-state shape. This steady-state shape x can be computed as a geometric correction t on the domain X by solving the weak form of the steady-state equation v · n(t) = 0, where v is the material velocity and n the outward normal of the corrected free surface, using a Streamline Upwind Petrov Galerkin (SUPG) formulation. There is a strong coupling between the domain shape and the material flow; hence, a two-step fixed-point iterative resolution algorithm was proposed that involves (1) the computation of the flow field by solving the thermo-mechanical equations on a prescribed domain shape and (2) the computation of the steady-state shape for an assumed velocity field. The contact equations are introduced in penalty form both during the flow computation and during the free-surface correction. The fact that the contact description is inhomogeneous, i.e., defined in nodal form in the former and in weighted-residual form in the latter, is assumed to be critical to the convergence of certain problems. Thus, the notion of nodal collocation is invoked in the weak form of the surface correction equation to homogenize the contact coupling. The surface correction algorithm is tested on analytical test cases and the contact coupling is tested on hot rolling problems.

  20. Laser-Based Surface Modification of Microstructure for Carbon Fiber-Reinforced Plastics

    Science.gov (United States)

    Yang, Wenfeng; Sun, Ting; Cao, Yu; Li, Shaolong; Liu, Chang; Tang, Qingru

    2018-05-01

    Bonding repair is a powerful feature of carbon fiber-reinforced plastics (CFRP). According to the theory of interface bonding, the interfacial adhesion strength and reliability of a CFRP structure are directly affected by the microscopic features of the CFRP surface, including its microstructure and its physical and chemical characteristics. In this paper, laser-based surface modification was compared with peel-ply, grinding, and polishing in terms of the resulting CFRP surface microstructure. The surface microstructure, morphology, fiber damage, and height and spatial parameters were investigated by scanning electron microscopy (SEM) and laser confocal microscopy (LCM). Relative to the conventional grinding process, laser modification of the CFRP surface can result in more uniform resin removal and better processing control and repeatability. This decreases the adverse impact of surface fiber fractures and secondary damage. The surface properties were significantly improved, as reflected in the surface roughness, microstructure uniformity, and actual surface area. The surface microstructure improved by laser modification is more conducive to interfacial bonding in CFRP structure repair, and can enhance the adhesion strength and reliability of the repair.

  1. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Throneburg, E. B.; Jones, J. M. [AREVA NP Inc., 7207 IBM Drive, Charlotte, NC 28262 (United States)

    2006-07-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  2. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    International Nuclear Information System (INIS)

    Throneburg, E. B.; Jones, J. M.

    2006-01-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  3. Students' Motivation toward Computer-Based Language Learning

    Science.gov (United States)

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  4. Neural Computation of Surface Border Ownership and Relative Surface Depth from Ambiguous Contrast Inputs.

    Science.gov (United States)

    Dresp-Langley, Birgitta; Grossberg, Stephen

    2016-01-01

    The segregation of image parts into foreground and background is an important aspect of the neural computation of 3D scene perception. To achieve such segregation, the brain needs information about border ownership; that is, the belongingness of a contour to a specific surface represented in the image. This article presents psychophysical data derived from 3D percepts of figure and ground that were generated by presenting 2D images composed of spatially disjoint shapes that pointed inward or outward relative to the continuous boundaries that they induced along their collinear edges. The shapes in some images had the same contrast (black or white) with respect to the background gray. Other images included opposite contrasts along each induced continuous boundary. Psychophysical results demonstrate conditions under which figure-ground judgment probabilities in response to these ambiguous displays are determined by the orientation of contrasts only, not by their relative contrasts, despite the fact that many border ownership cells in cortical area V2 respond to a preferred relative contrast. Studies are also reviewed in which both polarity-specific and polarity-invariant properties obtain. The FACADE and 3D LAMINART models are used to explain these data.

  5. Neural computation of surface border ownership and relative surface depth from ambiguous contrast inputs

    Directory of Open Access Journals (Sweden)

    Birgitta Dresp-Langley

    2016-07-01

    Full Text Available The segregation of image parts into foreground and background is an important aspect of the neural computation of 3D scene perception. To achieve such segregation, the brain needs information about border ownership; that is, the belongingness of a contour to a specific surface represented in the image. This article presents psychophysical data derived from 3D percepts of figure and ground that were generated by presenting 2D images composed of spatially disjoint shapes that pointed inward or outward relative to the continuous boundaries that they induced along their collinear edges. The shapes in some images had the same contrast (black or white) with respect to the background gray. Other images included opposite contrasts along each induced continuous boundary. Results demonstrate conditions under which figure-ground judgment probabilities in response to these ambiguous displays are determined by the orientation of contrasts only, not by their relative contrasts, despite the fact that many border ownership cells in cortical area V2 respond to a preferred relative contrast. Studies are also reviewed in which both polarity-specific and polarity-invariant properties obtain. The FACADE and 3D LAMINART models are used to explain these data.

  6. Neural Computation of Surface Border Ownership and Relative Surface Depth from Ambiguous Contrast Inputs

    Science.gov (United States)

    Dresp-Langley, Birgitta; Grossberg, Stephen

    2016-01-01

    The segregation of image parts into foreground and background is an important aspect of the neural computation of 3D scene perception. To achieve such segregation, the brain needs information about border ownership; that is, the belongingness of a contour to a specific surface represented in the image. This article presents psychophysical data derived from 3D percepts of figure and ground that were generated by presenting 2D images composed of spatially disjoint shapes that pointed inward or outward relative to the continuous boundaries that they induced along their collinear edges. The shapes in some images had the same contrast (black or white) with respect to the background gray. Other images included opposite contrasts along each induced continuous boundary. Psychophysical results demonstrate conditions under which figure-ground judgment probabilities in response to these ambiguous displays are determined by the orientation of contrasts only, not by their relative contrasts, despite the fact that many border ownership cells in cortical area V2 respond to a preferred relative contrast. Studies are also reviewed in which both polarity-specific and polarity-invariant properties obtain. The FACADE and 3D LAMINART models are used to explain these data. PMID:27516746

  7. Inference-Based Surface Reconstruction of Cluttered Environments

    KAUST Repository

    Biggers, K.

    2012-08-01

    We present an inference-based surface reconstruction algorithm that is capable of identifying objects of interest among a cluttered scene, and reconstructing solid model representations even in the presence of occluded surfaces. Our proposed approach incorporates a predictive modeling framework that uses a set of user-provided models for prior knowledge, and applies this knowledge to the iterative identification and construction process. Our approach uses a local to global construction process guided by rules for fitting high-quality surface patches obtained from these prior models. We demonstrate the application of this algorithm on several example data sets containing heavy clutter and occlusion. © 2012 IEEE.

  8. Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults

    Science.gov (United States)

    Wang, Feihong; Lockee, Barbara B.; Burton, John K.

    2012-01-01

    The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…

  9. GPU-based cone beam computed tomography.

    Science.gov (United States)

    Noël, Peter B; Walczak, Alan M; Xu, Jinhui; Corso, Jason J; Hoffmann, Kenneth R; Schafer, Sebastian

    2010-06-01

    The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is the filtered backprojection, which for a volume of size 256^3 takes up to 25 min on a standard system. Recent developments in the area of Graphic Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on an NVIDIA GeForce GTX 280. Our implementation results in improved reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe if differences occur between CPU and GPU-based reconstructions. By using our approach, the computation time for 256^3 is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512^3 volumes is 8.5 s. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
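
    The per-voxel work that such a CUDA kernel parallelizes can be illustrated with a much smaller 2D parallel-beam backprojection in NumPy; cone-beam FDK adds a third dimension, ramp filtering and distance weighting. This is a pedagogical stand-in, not the authors' GPU implementation.

```python
import numpy as np

def backproject_2d(sinogram, angles, n):
    """Voxel-driven backprojection of a (filtered) parallel-beam sinogram
    onto an n x n pixel grid, with linear interpolation on the detector."""
    xs = np.arange(n) - (n - 1) / 2.0
    X, Y = np.meshgrid(xs, xs, indexing="xy")
    n_det = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, theta in zip(sinogram, angles):
        # detector coordinate of every pixel for this view
        t = X * np.cos(theta) + Y * np.sin(theta) + (n_det - 1) / 2.0
        i0 = np.floor(t).astype(int)
        w = t - i0
        valid = (i0 >= 0) & (i0 < n_det - 1)
        i0 = np.clip(i0, 0, n_det - 2)
        recon += np.where(valid, (1.0 - w) * proj[i0] + w * proj[i0 + 1], 0.0)
    return recon * np.pi / len(angles)

# Smoke test: backproject a single bright detector column (illustrative only,
# no ramp filtering applied).
angles = np.linspace(0.0, np.pi, 90, endpoint=False)
sino = np.zeros((90, 128))
sino[:, 64] = 1.0
print(backproject_2d(sino, angles, 128).shape)
```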

  10. Surface pressure and aerodynamic loads determination of a transonic airfoil based on particle image velocimetry

    International Nuclear Information System (INIS)

    Ragni, D; Ashok, A; Van Oudheusden, B W; Scarano, F

    2009-01-01

    The present investigation assesses a procedure to extract the aerodynamic loads and pressure distribution on an airfoil in the transonic flow regime from particle image velocimetry (PIV) measurements. The wind tunnel model is a two-dimensional NACA-0012 airfoil, and the PIV velocity data are used to evaluate pressure fields, whereas lift and drag coefficients are inferred from the evaluation of momentum contour and wake integrals. The PIV-based results are compared to those derived from conventional load determination procedures involving surface pressure transducers and a wake rake. The method applied in this investigation is an extension to the compressible flow regime of that considered by van Oudheusden et al (2006 Non-intrusive load characterization of an airfoil using PIV Exp. Fluids 40 988–92) at low-speed conditions. The application of a high-speed imaging system allows a sufficient ensemble size to be acquired in a relatively short time to compute converged velocity statistics; these are further translated into turbulent fluctuations included in the pressure and load calculations, notwithstanding their verified negligible influence on the computation. Measurements are performed at varying spatial resolution to optimize the load determination in the wake region and around the airfoil, further allowing us to assess the influence of spatial resolution on the proposed procedure. Specific interest is given to the comparisons between the PIV-based method and the conventional procedures for determining the pressure coefficient on the surface and the drag and lift coefficients at different angles of attack. Results are presented for the experiments at a free-stream Mach number M = 0.6, with the angle of attack ranging from 0° to 8°.

  11. A waveless free surface flow past a submerged triangular obstacle in presence of surface tension

    Directory of Open Access Journals (Sweden)

    Hakima Sekhri

    2016-07-01

    Full Text Available We consider free-surface flows past a submerged triangular obstacle at the bottom of a channel. The problem is characterized by a nonlinear boundary condition on a free surface of unknown configuration. Exact analytical solutions for these problems are not known. Following Dias and Vanden Broeck [6], we compute the solutions numerically via a series truncation method. These solutions depend on two parameters: the Weber number $\alpha$, characterizing the strength of the surface tension, and the angle $\beta$ at the base, characterizing the shape of the apex. Although free-surface flows with surface tension admit capillary waves, it is found that solutions exist only for values of the Weber number greater than $\alpha_0$ for different configurations of the triangular obstacle.

  12. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  13. Comparative evaluation of tensile bond strength of silicone-based denture liners after thermocycling and surface treatment

    Directory of Open Access Journals (Sweden)

    Harsimran Kaur

    2015-01-01

    Full Text Available Purpose: To examine, evaluate, and compare the tensile bond strength of two silicone-based liners, one autopolymerizing and one heat-cured, when treated with different chemical etchants to improve their adhesion to denture base resin. Materials and Methods: One hundred and sixty test specimens of heat-cured polymethyl methacrylate (PMMA) were fabricated; 80 specimens were tested for tensile bond strength after bonding to the autopolymerizing resilient liner (Ufigel P) and the remaining 80 to the heat-cured resilient liner (Molloplast B). Each main group was further divided into four subgroups of 20 specimens each, one acting as a control and three subjected to surface treatment with different chemical etchants, namely dichloromethane, MMA monomer, and chloroform. The two silicone-based denture liners were processed between 2 PMMA specimens (10 mm × 10 mm × 40 mm) in the 3 mm space provided by a spacer, thermocycled (5-55°C) for 500 cycles, and then the tensile bond strength was measured in a universal testing machine. Results: One-way ANOVA showed a highly significant difference in the mean tensile bond strength values among the groups. The computed Student's t-test statistics for the compared groups were greater than the critical values at both the 5% and 1% levels. Conclusion: Surface treatment of the denture base resin with chemical etchants prior to the application of the silicone-based liners (Ufigel P and Molloplast-B) increased the tensile bond strength. The increase was the highest for specimens subjected to 180 s of MMA surface treatment and the lowest for control group specimens.
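
    The statistical comparison described (one-way ANOVA across treatment groups followed by pairwise t-tests) can be reproduced on any such dataset with SciPy; the sketch below uses invented bond-strength numbers purely to show the procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical tensile bond strength measurements (MPa) for one liner,
# one group per surface treatment; numbers are invented for illustration.
groups = {
    "control":         np.array([1.10, 1.22, 1.05, 1.18, 1.15]),
    "dichloromethane": np.array([1.45, 1.52, 1.49, 1.40, 1.55]),
    "MMA monomer":     np.array([1.78, 1.85, 1.90, 1.72, 1.88]),
    "chloroform":      np.array([1.35, 1.30, 1.42, 1.38, 1.33]),
}

# One-way ANOVA across the four treatments, then a two-sample t-test
# comparing one treatment against the control.
f_stat, p_anova = stats.f_oneway(*groups.values())
t_stat, p_ttest = stats.ttest_ind(groups["MMA monomer"], groups["control"])
print("ANOVA: F = %.2f, p = %.4g" % (f_stat, p_anova))
print("MMA vs control: t = %.2f, p = %.4g" % (t_stat, p_ttest))
```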

  14. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  15. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  16. Educational Games for Early Childhood: Using Tabletop Surface Computers for Teaching the Arabic Alphabet

    DEFF Research Database (Denmark)

    Papadopoulos, Pantelis M.; Ibrahim, Zeinab; Karatsolis, Andreas

    2015-01-01

    This paper presents initial evaluation regarding the use of simple educational games on tabletop surface computers to teach Kindergarten students in Qatar the Arabic alphabet. This effort is part of the “Arabiyyatii” research project, a 3-year endeavor aimed to teach 5-year-olds Modern Standard Arabic (MSA). The paper describes a naturalistic study design, following the activities of 18 students for a period of 9 weeks in the project. All students were native speakers of the Qatari dialect and they were early users of similar surface technologies. The paper presents three of the games available to the students, along with data collected from system log files and class observations. Result analysis suggests that these kinds of games could be useful in (a) enhancing students’ engagement in language learning, (b) increasing their exposure to MSA, and (c) developing their vocabulary.

  17. Glossiness of Colored Papers based on Computer Graphics Model and Its Measuring Method

    Science.gov (United States)

    Aida, Teizo

    In the case of colored papers, the color of the surface strongly affects the gloss of the paper. A new glossiness measure for such colored papers is suggested in this paper. First, using achromatic and chromatic Munsell colored chips, the author obtained an experimental equation representing the relation between lightness V (or V and saturation C) and psychological glossiness Gph of these chips. Then, the author defined a new glossiness G for colored papers, based on the above-mentioned experimental equations for Gph and on Cook-Torrance's reflection model, which is widely used in the field of Computer Graphics. This new glossiness is shown to be nearly proportional to the psychological glossiness Gph. The measuring system for the new glossiness G is furthermore described. The measuring time for one specimen is within 1 minute.
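
    A sketch of the Cook-Torrance specular term the abstract refers to, using the Beckmann microfacet distribution, Schlick's Fresnel approximation (in place of the full Fresnel equations), and the standard geometric attenuation factor. The vectors and material parameters are chosen purely for illustration; the paper's experimental Munsell-based equation for Gph is not reproduced because it is not given in the abstract.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def cook_torrance_specular(n, v, l, roughness=0.3, f0=0.04):
    """Cook-Torrance specular reflectance for one light/view direction.

    Uses the common 1/(4 (N.V)(N.L)) normalization; the original paper
    divides by pi instead.
    """
    h = normalize(v + l)                       # half vector
    n_dot_h = max(np.dot(n, h), 1e-6)
    n_dot_v = max(np.dot(n, v), 1e-6)
    n_dot_l = max(np.dot(n, l), 1e-6)
    v_dot_h = max(np.dot(v, h), 1e-6)

    # Beckmann microfacet distribution D
    m2 = roughness ** 2
    d = np.exp((n_dot_h ** 2 - 1.0) / (m2 * n_dot_h ** 2)) / (np.pi * m2 * n_dot_h ** 4)
    # Schlick approximation of the Fresnel term F
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    # Geometric attenuation (shadowing/masking) term G
    g = min(1.0, 2.0 * n_dot_h * n_dot_v / v_dot_h, 2.0 * n_dot_h * n_dot_l / v_dot_h)

    return (d * f * g) / (4.0 * n_dot_v * n_dot_l)

n = np.array([0.0, 0.0, 1.0])
v = normalize(np.array([0.0, 0.3, 1.0]))
l = normalize(np.array([0.0, -0.3, 1.0]))
print("specular reflectance:", cook_torrance_specular(n, v, l))
```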

  18. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D tree models because the tree model itself has a very complicated structure and many sophisticated factors need to be considered during the simulation. Although there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees in wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the stringent real-time requirements of computer games.
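
    A minimal sketch of evaluating a Ball B-Spline Curve (BBSC), assuming the common formulation in which each control point carries a radius interpolated with the same B-spline basis as the 3D skeleton, giving a swept-sphere representation of a branch. The control data are invented for illustration, and the paper's wind-response simulation is not shown.

```python
import numpy as np
from scipy.interpolate import BSpline

degree = 3
# Control balls (x, y, z, radius) along a hypothetical branch.
control = np.array([
    [0.0, 0.0, 0.0, 0.30],
    [0.1, 0.0, 0.5, 0.25],
    [0.3, 0.1, 1.0, 0.18],
    [0.6, 0.2, 1.4, 0.12],
    [1.0, 0.3, 1.7, 0.06],
])
n_ctrl = len(control)
# Clamped uniform knot vector: n_ctrl + degree + 1 knots.
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        np.ones(degree)])

# One spline evaluates centers and radii together (coefficients are 4-D).
curve = BSpline(knots, control, degree)

for t in np.linspace(0.0, 1.0, 5):
    x, y, z, r = curve(t)
    print(f"t={t:.2f}: center=({x:.2f}, {y:.2f}, {z:.2f}), radius={r:.3f}")
```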

  19. A computationally efficient 3D finite-volume scheme for violent liquid–gas sloshing

    CSIR Research Space (South Africa)

    Oxtoby, Oliver F

    2015-10-01

    Full Text Available We describe a semi-implicit volume-of-fluid free-surface-modelling methodology for flow problems involving violent free-surface motion. For efficient computation, a hybrid-unstructured edge-based vertex-centred finite volume discretisation...

  20. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stay confidential; the banks, as clients, learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...
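
    The secure SPDZ evaluation cannot be reconstructed from the abstract, but the linear-programming benchmarking it mentions can be sketched in the clear. The DEA-style (input-oriented CCR) formulation and the farm data below are assumptions chosen for illustration, not the paper's actual model or data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical farms: rows are units, columns are inputs / outputs.
inputs = np.array([[2.0, 3.0],     # e.g. debt, operating cost
                   [4.0, 2.0],
                   [3.0, 5.0],
                   [5.0, 4.0]])
outputs = np.array([[10.0],        # e.g. revenue
                    [11.0],
                    [12.0],
                    [13.0]])
n_units, n_in = inputs.shape
n_out = outputs.shape[1]

def efficiency(k):
    """Input-oriented CCR efficiency score of unit k (1.0 = efficient)."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.concatenate(([1.0], np.zeros(n_units)))   # minimize theta
    A_ub, b_ub = [], []
    for i in range(n_in):       # sum_j lambda_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-inputs[k, i]], inputs[:, i])))
        b_ub.append(0.0)
    for r in range(n_out):      # sum_j lambda_j * y_rj >= y_rk
        A_ub.append(np.concatenate(([0.0], -outputs[:, r])))
        b_ub.append(-outputs[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n_units), method="highs")
    return res.x[0]

for k in range(n_units):
    print(f"unit {k}: efficiency = {efficiency(k):.3f}")
```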