WorldWideScience

Sample records for public domain computer

  1. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Science.gov (United States)

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... GENERAL PROVISIONS § 201.26 Recordation of documents pertaining to computer shareware and donation of public domain computer software. (a) General. This section prescribes the procedures for submission of...

  2. PUBLIC DOMAIN PROTECTION. USES AND REUSES OF PUBLIC DOMAIN WORKS

    Directory of Open Access Journals (Sweden)

    Monica Adriana LUPAȘCU

    2015-07-01

    Full Text Available This study tries to highlight the necessity of an awareness of the right of access to the public domain, particularly using the example of works whose protection period has expired, as well as the ones which the law considers to be excluded from protection. Such works are used not only by large libraries from around the world, but also by rights holders, via different means of use, including incorporations into original works or adaptations. However, the reuse that follows these uses often only remains at the level of concept, as the notion of the public’s right of access to public domain works is not substantiated, nor is the notion of the correct or legal use of such works.

  3. Public Domain; Public Interest; Public Funding: Focussing on the ‘three Ps’ in Scientific Research

    Directory of Open Access Journals (Sweden)

    Mags McGinley

    2005-03-01

    Full Text Available The purpose of this paper is to discuss the ‘three Ps’ of scientific research: Public Domain; Public Interest; Public Funding. This is done by examining some of the difficulties faced by scientists engaged in scientific research who may have problems working within the constraints of current copyright and database legislation, where property claims can place obstacles in the way of research, in other words, the public domain. The article then looks at perceptions of the public interest and asks whether copyright and the database right reflect understandings of how this concept should operate. Thirdly, it considers the relevance of public funding for scientific research in the context of both the public domain and of the public interest. Finally, some recent initiatives seeking to change the contours of the legal framework will be examined.

  4. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial...... on secret values and results are only revealed according to specific protocols. We identify the key linguistic concepts of SMC and bridge the gap between high-level security requirements and low-level cryptographic operations constituting an SMC platform, thus improving the efficiency and security of SMC...

  5. Violence defied?: A review of prevention of violence in public and semi-public domain

    NARCIS (Netherlands)

    Knaap, L.M. van der; Nijssen, L.T.J.; Bogaerts, S.

    2006-01-01

    This report provides a synthesis of 48 studies of the effects of the prevention of violence in the public and semi-public domain. The following research questions were stated for this study: What measures for the prevention of violence in the public and semi-public domain are known and have been

  6. Public licenses and public domain as alternatives to copyright

    OpenAIRE

    Köppel, Petr

    2012-01-01

    The work first introduces the area of public licenses as a space between the copyright law and public domain. After that, consecutively for proprietary software, free and open source software, open hardware and open content, it maps particular types of public licenses and the accompanying social and cultural movements, puts them in mutual as well as historical context, examines their characteristics and compares them to each other, shows how the public licenses are defined by various accompan...

  7. Preserving the positive functions of the public domain in science

    Directory of Open Access Journals (Sweden)

    Pamela Samuelson

    2003-11-01

    Full Text Available Science has advanced in part because data and scientific methodologies have traditionally not been subject to intellectual property protection. In recent years, intellectual property has played a greater role in scientific work. While intellectual property rights may have a positive role to play in some fields of science, so does the public domain. This paper will discuss some of the positive functions of the public domain and ways in which certain legal developments may negatively impact the public domain. It suggests some steps that scientists can take to preserve the positive functions of the public domain for science.

  8. The Definition, Dimensions, and Domain of Public Relations.

    Science.gov (United States)

    Hutton, James G.

    1999-01-01

    Discusses how the field of public relations has left itself vulnerable to other fields that are making inroads into public relations' traditional domain, and to critics who are filling in their own definitions of public relations. Proposes a definition and a three-dimensional framework to compare competing philosophies of public relations and to…

  9. Fast resolution of the neutron diffusion equation through public domain Ode codes

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, V.M.; Vidal, V.; Garayoa, J. [Universidad Politecnica de Valencia, Departamento de Sistemas Informaticos, Valencia (Spain); Verdu, G. [Universidad Politecnica de Valencia, Departamento de Ingenieria Quimica y Nuclear, Valencia (Spain); Gomez, R. [I.E.S. de Tavernes Blanques, Valencia (Spain)

    2003-07-01

    The time-dependent neutron diffusion equation is a partial differential equation with source terms. The resolution method usually includes discretizing the spatial domain, obtaining a large system of linear, stiff ordinary differential equations (ODEs), whose resolution is computationally very expensive. Some standard techniques use a fixed time step to solve the ODE system. This can result in errors (if the time step is too large) or in long computing times (if the time step is too small). To speed up the resolution method, two well-known public domain codes have been selected: DASPK and FCVODE, which are powerful codes for the resolution of large systems of stiff ODEs. These codes can estimate the error after each time step and, depending on this estimation, can decide the new time step and, possibly, the integration method to be used in the next step. With these mechanisms, it is possible to keep the overall error below the chosen tolerances and, when the system behaves smoothly, to take large time steps, increasing the execution speed. In this paper we address the use of the public domain codes DASPK and FCVODE for the resolution of the time-dependent neutron diffusion equation. The efficiency of these codes depends largely on the preconditioning of the large systems of linear equations that must be solved. Several preconditioners have been programmed and tested; it was found that the multigrid method is the best of the preconditioners tested. It was also found that DASPK performed better than FCVODE, being more robust for our problem. We can conclude that the use of specialized codes for solving large systems of ODEs can drastically reduce the computational work needed for the solution; combining them with appropriate preconditioners, the reduction can be even greater. This approach has another crucial advantage: it allows the user to specify the allowed error, which cannot be done in fixed-step implementations; this, of course
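The error-controlled, adaptive time-stepping strategy this abstract describes can be sketched with a public-domain solver. The example below uses SciPy's BDF integrator as a stand-in for DASPK/FCVODE (an assumption for illustration; the paper's codes, preconditioners, and neutron diffusion system are not reproduced) on a small stiff problem:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stiff test problem: y' = -1000*(y - cos(t)) - sin(t) with y(0) = 1.
# The exact solution is y(t) = cos(t); the -1000 term creates a fast mode
# that would force a fixed-step explicit method to take tiny steps everywhere.
def rhs(t, y):
    return [-1000.0 * (y[0] - np.cos(t)) - np.sin(t)]

# BDF is an implicit multistep method with adaptive step-size and order
# selection, the same family of techniques implemented by DASPK and FCVODE.
sol = solve_ivp(rhs, (0.0, 10.0), [1.0], method="BDF", rtol=1e-6, atol=1e-9)

# The solver picks its own steps to keep the estimated local error below
# the requested tolerances, taking large steps where the solution is smooth.
err = float(np.max(np.abs(sol.y[0] - np.cos(sol.t))))
print(f"steps taken: {sol.t.size - 1}, max error vs exact: {err:.2e}")
```

Because the user specifies tolerances rather than a step size, the overall error stays controlled while the step count remains small, which is exactly the advantage the abstract attributes to the adaptive codes.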

  10. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G [Albuquerque, NM

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  11. Computing the Feng-Rao distances for codes from order domains

    DEFF Research Database (Denmark)

    Ruano Benito, Diego

    2007-01-01

    We compute the Feng–Rao distance of a code coming from an order domain with a simplicial value semigroup. The main tool is the Apéry set of a semigroup that can be computed using a Gröbner basis.
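As a sketch of the main tool named in the abstract, the Apéry set of a numerical semigroup can be computed by direct enumeration for small examples (the paper computes it via Gröbner bases; this brute-force version is an illustrative assumption, not the paper's method):

```python
def semigroup_elements(gens, bound):
    """All elements of the numerical semigroup generated by `gens`
    up to `bound`, found by dynamic programming."""
    reachable = [False] * (bound + 1)
    reachable[0] = True
    for v in range(1, bound + 1):
        reachable[v] = any(v >= g and reachable[v - g] for g in gens)
    return [v for v in range(bound + 1) if reachable[v]]

def apery_set(gens, n):
    """Apery set of the semigroup with respect to n: for each residue
    class i mod n, the smallest semigroup element congruent to i."""
    # Enumerate elements up to a bound large enough for small examples;
    # a general implementation would need a provably sufficient bound.
    elems = semigroup_elements(gens, n * max(gens))
    ap = {}
    for s in elems:
        r = s % n
        if r not in ap:
            ap[r] = s  # elems is sorted, so the first hit is the minimum
    return sorted(ap.values())

# Example: for S = <3, 5>, the Apery set with respect to 3 is {0, 5, 10}.
print(apery_set([3, 5], 3))
```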

  12. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  13. Cultural Heritage and the Public Domain

    Directory of Open Access Journals (Sweden)

    Bas Savenije

    2012-09-01

    by providing their resources on the Internet” (Berlin Declaration 2003). Therefore, in the spirit of the Berlin Declaration, the ARL encourages its members’ libraries to grant all non-commercial users “a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship”. And: “If fees are to be assessed for the use of digitised public domain works, those fees should only apply to commercial uses” (ARL Principles July 2010). In our view, cultural heritage institutions should make public domain material digitised with public funding as widely available as possible for access and reuse. The public sector has the primary responsibility to fund digitisation. The involvement of private partners, however, is encouraged by ARL as well as the Comité des Sages. Private funding for digitisation is a complement to the necessary public investment, especially in times of economic crisis, but should not be seen as a substitute for public funding. As we can see from these reports there are a number of arguments in favour of digitisation and also of providing maximum accessibility to the digitised cultural heritage. In this paper we will investigate the legal aspects of digitisation of cultural heritage, especially public domain material. On the basis of these we will make an inventory of policy considerations regarding reuse. Furthermore, we will describe the conclusions the National Library of the Netherlands (hereafter: KB) has formulated and the arguments that support these. In this context we will review public-private partnerships and also the policy of the KB. We will conclude with recommendations for cultural heritage institutions concerning a reuse policy for digitised public domain material.

  14. Domain decomposition methods and parallel computing

    International Nuclear Information System (INIS)

    Meurant, G.

    1991-01-01

    In this paper, we show how to efficiently solve large linear systems on parallel computers. These linear systems arise from discretization of scientific computing problems described by systems of partial differential equations. We show how to get a discrete finite dimensional system from the continuous problem, and the chosen conjugate gradient iterative algorithm is briefly described. Then, the different kinds of parallel architectures are reviewed and their advantages and deficiencies are emphasized. We sketch the problems found in programming the conjugate gradient method on parallel computers. For this algorithm to be efficient on parallel machines, domain decomposition techniques are introduced. We give results of numerical experiments showing that these techniques allow a good rate of convergence for the conjugate gradient algorithm as well as computational speeds in excess of a billion floating point operations per second. (author). 5 refs., 11 figs., 2 tabs., 1 inset
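The conjugate gradient iteration mentioned in this abstract, applied to the kind of system obtained by discretizing a PDE, can be sketched as follows (a minimal serial version; the paper's contribution is the parallel, domain-decomposed variant, which is not reproduced here):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for a symmetric positive definite
    system A x = b (the classic Hestenes-Stiefel iteration)."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1D Poisson matrix (tridiagonal, SPD): the prototypical discrete
# finite-dimensional system arising from a PDE discretization.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(f"residual norm: {np.linalg.norm(b - A @ x):.2e}")
```

Each iteration needs only a matrix-vector product and a few vector operations, which is why the method parallelizes well once the domain (and hence the matrix) is decomposed across processors.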

  15. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale, uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...

  16. Domain of attraction computation for tumor dynamics

    NARCIS (Netherlands)

    Doban, A.I.; Lazar, M.

    2014-01-01

    In this paper we propose the use of rational Lyapunov functions to estimate the domain of attraction of the tumor dormancy equilibrium of immune cells-malignant cells interaction dynamics. A procedure for computing rational Lyapunov functions is worked out, with focus on obtaining a meaningful

  17. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

    Full Text Available In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences in root properties measured using ImageJ and Safira are suspected. This study compared values of root length and surface area obtained with public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk), and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by the confidence interval and t-test. Both systems ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of surface area of roots sampled in CXve. The confidence interval (95 %) revealed that root length measurements with Safira did not differ from those of the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are

  18. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
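The embedded-DSL idea (approach (iv) above) can be illustrated generically: the simulation specification is ordinary Python code built from objects rather than a configuration file. All class and method names below are invented for illustration; they are not the actual OOMMF Python interface described in the paper:

```python
# A toy illustration of embedding a simulation specification in a
# general-purpose language. Names here are hypothetical.

class Mesh:
    """A 1D regular mesh: domain length and number of cells."""
    def __init__(self, length, cells):
        self.length, self.cells = length, cells
        self.dx = length / cells

class Simulation:
    """The simulation is composed from ordinary Python objects, so runs
    can be scripted, looped over parameters, and version-controlled."""
    def __init__(self, mesh, initial):
        self.mesh = mesh
        self.state = [initial(i * mesh.dx) for i in range(mesh.cells)]

    def relax(self, steps=100, rate=0.1):
        # Simple periodic diffusive relaxation, standing in for the
        # energy-minimizing driver of a real micromagnetic backend.
        for _ in range(steps):
            s, n = self.state, len(self.state)
            self.state = [s[i] + rate * (s[i - 1] - 2 * s[i] + s[(i + 1) % n])
                          for i in range(n)]
        return self

# The specification is plain Python: flexible, inspectable, reusable.
sim = Simulation(Mesh(length=1.0, cells=8), initial=lambda x: x * x)
sim.relax(steps=50)
print(f"state range after relaxation: {max(sim.state) - min(sim.state):.4f}")
```

Compared with re-compilation or configuration files, this style lets users express parameter sweeps and derived quantities with ordinary language constructs, which is the reproducibility argument the abstract makes.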

  19. Error analysis of a public domain pronunciation dictionary

    CSIR Research Space (South Africa)

    Martirosian, O

    2007-11-01

    Full Text Available ], a popular public domain resource that is widely used in English speech processing systems. The techniques being investigated are applied to the lexicon and the results of each step are illustrated using sample entries. The authors found that as many...

  20. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it gives a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  1. Research Progress of Global Land Domain Service Computing: Take GlobeLand 30 as an Example

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2017-10-01

    Full Text Available Combining service-computing technology with domain requirements is one of the important development directions of geographic information under Internet+, which provides highly efficient technical means for information sharing and collaborative services. Using GlobeLand 30 as an example, this paper analyzes the basic problems of integrating land cover information processing and service computing, introduces the latest research progress in domain service modeling, online computing method and dynamic service technology, and the GlobeLand 30 information service platform. The paper also discusses the further development directions of GlobeLand 30 domain service computing.

  2. Domain Decomposition: A Bridge between Nature and Parallel Computers

    Science.gov (United States)

    1992-09-01

    B., "Domain Decomposition Algorithms for Indefinite Elliptic Problems," SIAM Journal of Scientific and Statistical Computing, Vol. 13, 1992, pp...AD-A256 575 NASA Contractor Report 189709 ICASE Report No. 92-44 ICASE DOMAIN DECOMPOSITION: A BRIDGE BETWEEN NATURE AND PARALLEL COMPUTERS DTIC...effectively implemented on distributed memory multiprocessors. In 1990 (as reported in Ref. 38 using the tile algorithm), a 103,201-unknown 2D elliptic

  3. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Full Text Available Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  4. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric- and historical-methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  5. Two-phase flow steam generator simulations on parallel computers using domain decomposition method

    International Nuclear Information System (INIS)

    Belliard, M.

    2003-01-01

    Within the framework of the Domain Decomposition Method (DDM), we present industrial steady state two-phase flow simulations of PWR Steam Generators (SG) using iteration-by-sub-domain methods: standard and Adaptive Dirichlet/Neumann methods (ADN). The averaged mixture balance equations are solved by a Fractional-Step algorithm, jointly with the Crank-Nicholson scheme and the Finite Element Method. The algorithm works with overlapping or non-overlapping sub-domains and with conforming or nonconforming meshing. Computations are run on PC networks or on massively parallel mainframe computers. A CEA code-linker and the PVM package are used (master-slave context). SG mock-up simulations, involving up to 32 sub-domains, highlight the efficiency (speed-up, scalability) and the robustness of the chosen approach. With the DDM, the computational problem size is easily increased to about 1,000,000 cells and the CPU time is significantly reduced. The difficulties related to industrial use are also discussed. (author)
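Iteration-by-subdomain can be illustrated on a toy problem. The sketch below applies an alternating overlapping Schwarz iteration (a simpler relative of the Dirichlet/Neumann methods in the abstract, used here only for illustration) to a 1D Poisson equation split across two subdomains:

```python
import numpy as np

# -u'' = 1 on (0, 1) with u(0) = u(1) = 0; exact solution u = x(1 - x)/2.
n = 41                      # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
u = np.zeros(n)             # global iterate

def solve_subdomain(u, lo, hi):
    """Solve the discrete Poisson problem on points lo..hi-1, taking
    Dirichlet data from the current global iterate at the interfaces."""
    m = hi - lo
    A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
    rhs = np.ones(m)
    left = u[lo - 1] if lo > 0 else 0.0    # physical boundary if lo == 0
    right = u[hi] if hi < n else 0.0       # physical boundary if hi == n
    rhs[0] += left / h**2
    rhs[-1] += right / h**2
    u[lo:hi] = np.linalg.solve(A, rhs)

# Two overlapping subdomains (points 16..24 are shared); each sweep solves
# both subdomains, exchanging interface values between solves.
for _ in range(30):
    solve_subdomain(u, 0, 25)    # left subdomain
    solve_subdomain(u, 16, n)    # right subdomain

exact = x * (1 - x) / 2
print(f"max error: {np.max(np.abs(u - exact)):.2e}")
```

The overlap controls the convergence rate of the interface exchange, which is the same trade-off the abstract discusses for overlapping versus non-overlapping sub-domains.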

  6. Towards development of a high quality public domain global roads database

    Directory of Open Access Journals (Sweden)

    Andrew Nelson

    2006-12-01

    Full Text Available There is clear demand for a global spatial public domain roads data set with improved geographic and temporal coverage, consistent coding of road types, and clear documentation of sources. The currently best available global public domain product covers only one-quarter to one-third of the existing road networks, and this varies considerably by region. Applications for such a data set span multiple sectors and would be particularly valuable for the international economic development, disaster relief, and biodiversity conservation communities, not to mention national and regional agencies and organizations around the world. The building blocks for such a global product are available for many countries and regions, yet thus far there has been neither strategy nor leadership for developing it. This paper evaluates the best available public domain and commercial data sets, assesses the gaps in global coverage, and proposes a number of strategies for filling them. It also identifies stakeholder organizations with an interest in such a data set that might either provide leadership or funding for its development. It closes with a proposed set of actions to begin the process.

  7. Assessment of current cybersecurity practices in the public domain : cyber indications and warnings domain.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

    This report assesses current public domain cyber security practices with respect to cyber indications and warnings. It describes cybersecurity industry and government activities, including cybersecurity tools, methods, practices, and international and government-wide initiatives known to be impacting current practice. Of particular note are the U.S. Government's Trusted Internet Connection (TIC) and 'Einstein' programs, which are serving to consolidate the Government's internet access points and to provide some capability to monitor and mitigate cyber attacks. Next, this report catalogs activities undertaken by various industry and government entities. In addition, it assesses the benchmarks of HPC capability and other HPC attributes that may lend themselves to assist in the solution of this problem. This report draws few conclusions, as it is intended to assess current practice in preparation for future work; however, no explicit references to HPC usage for the purpose of analyzing cyber infrastructure in near-real-time were found in current practice. This report and the related report SAND2010-4766, National Cyber Defense High Performance Computing and Analysis: Concepts, Planning and Roadmap, are intended to provoke discussion throughout a broad audience about developing a cohesive HPC-centric solution to wide-area cybersecurity problems.

  8. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    Science.gov (United States)

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  9. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  10. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  11. The Domain Shared by Computational and Digital Ontology: A Phenomenological Exploration and Analysis

    Science.gov (United States)

    Compton, Bradley Wendell

    2009-01-01

    The purpose of this dissertation is to explore and analyze a domain of research thought to be shared by two areas of philosophy: computational and digital ontology. Computational ontology is philosophy used to develop information systems also called computational ontologies. Digital ontology is philosophy dealing with our understanding of Being…

  12. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  13. Computational design of binding proteins to EGFR domain II.

    Directory of Open Access Journals (Sweden)

    Yoon Sup Choi

    Full Text Available We developed a process to produce novel interactions between two previously unrelated proteins. This process selects protein scaffolds and designs protein interfaces that bind to a surface patch of interest on a target protein. Scaffolds with shapes complementary to the target surface patch were screened using an exhaustive computational search of the human proteome and optimized by directed evolution using phage display. This method was applied to successfully design scaffolds that bind to epidermal growth factor receptor (EGFR) domain II, the interface of EGFR dimerization, with high reactivity toward the target surface patch of EGFR domain II. One potential application of these tailor-made protein interactions is the development of therapeutic agents against specific protein targets.

  14. A Public Domain Software Library for Reading and Language Arts.

    Science.gov (United States)

    Balajthy, Ernest

    A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…

  15. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  16. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    International Nuclear Information System (INIS)

    Desai, Ajit; Pettit, Chris; Poirel, Dominique; Sarkar, Abhijit

    2017-01-01

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems or linearized systems for non-linear problems with billions of unknowns. For stochastic modeling, it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems. And though these algorithms exhibit excellent scalabilities, significant algorithmic and implementational challenges exist to extend them to solve extreme-scale stochastic systems using emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolution in both spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain level local systems through multi-level iterative solvers. We also use parallel sparse matrix–vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation having spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.
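The building blocks this abstract names, a sparse iterative solver paired with a preconditioner, can be sketched in a few lines. The example below runs SciPy's conjugate gradient with a Jacobi preconditioner on a 1D diffusion operator with a spatially varying coefficient (a stand-in for illustration; the paper's polynomial chaos expansion and domain decomposition machinery are not reproduced):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# Discrete 1D diffusion operator with spatially varying diffusivity,
# stored sparsely (tridiagonal), as in a finite element/volume scheme.
n = 100
coef = 1.0 + 0.5 * np.sin(np.linspace(0, np.pi, n))   # varying coefficient
main = 2 * coef
off = -0.5 * (coef[:-1] + coef[1:])                   # neighbor coupling
A = diags([off, main, off], [-1, 0, 1], format="csr")

b = np.ones(n)

# Jacobi (diagonal) preconditioner: approximate A^{-1} by 1/diag(A).
M = LinearOperator((n, n), matvec=lambda r: r / main)

x, info = cg(A, b, M=M)   # info == 0 signals convergence
print("converged:", info == 0,
      "residual norm:", float(np.linalg.norm(b - A @ x)))
```

Sparse storage keeps the matrix-vector product at O(n) floating-point operations and memory, which is the same motivation the abstract gives for using parallel sparse operations at extreme scale.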

  17. Bringing computational science to the public.

    Science.gov (United States)

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands-on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  18. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
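The core of the FDTD method the book covers is a leapfrog update of electric and magnetic fields on a staggered (Yee) grid. Below is a minimal 1D sketch in normalized units; the grid size, soft Gaussian source, and perfectly-conducting (zeroed) ends are illustrative choices, not anything prescribed by the book.

```python
import numpy as np

def fdtd_1d(nx=200, steps=150):
    """1D FDTD (Yee) update for Maxwell's equations in vacuum, normalized
    so that c*dt = dx (the dispersion-free 'magic time step')."""
    ez = np.zeros(nx)      # electric field at integer grid points
    hy = np.zeros(nx - 1)  # magnetic field staggered half a cell
    for n in range(steps):
        hy += np.diff(ez)          # Faraday's law update
        ez[1:-1] += np.diff(hy)    # Ampere's law update; ends stay 0 (PEC walls)
        # Soft Gaussian source injected at the grid centre.
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)
    return ez

ez = fdtd_1d()
```

Real FDTD codes add material properties, absorbing boundaries (e.g. PML), and 2D/3D field components, but the interleaved half-step structure is the same.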

  19. Accumulation of Domain-Specific Physical Inactivity and Presence of Hypertension in Brazilian Public Healthcare System.

    Science.gov (United States)

    Turi, Bruna Camilo; Codogno, Jamile S; Fernandes, Romulo A; Sui, Xuemei; Lavie, Carl J; Blair, Steven N; Monteiro, Henrique Luiz

    2015-11-01

    Hypertension is one of the most common noncommunicable diseases worldwide, and physical inactivity is a risk factor predisposing to its occurrence and complications. However, the association between physical inactivity domains and hypertension remains unclear, especially in public healthcare systems. Thus, this study aimed to investigate the association between the aggregation of physical inactivity in different domains and the prevalence of hypertension among users of the Brazilian public health system. The sample comprised 963 participants. Subjects were divided into quartile groups according to 3 different domains of physical activity (occupational; physical exercises; and leisure-time and transportation). Hypertension was based on physician diagnosis. Physical inactivity in the occupational domain was significantly associated with a higher prevalence of hypertension (OR = 1.52 [1.05 to 2.21]). The same pattern occurred for physical inactivity in leisure-time (OR = 1.63 [1.11 to 2.39]) and for the aggregation of physical inactivity in all 3 domains (OR = 2.46 [1.14 to 5.32]). In the multivariate-adjusted model, the significant association between hypertension and physical inactivity in all 3 domains persisted (OR = 2.57 [1.14 to 5.79]). The results suggest an unequal prevalence of hypertension according to physical inactivity across different domains; increased promotion of physical activity in the healthcare system is needed.
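Odds ratios of the kind quoted above (e.g. OR = 1.52 [1.05 to 2.21]) are typically computed from a 2×2 exposure-outcome table, with a Wald 95% confidence interval on the log scale. A hedged sketch with made-up counts, not the study's data:

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Unadjusted odds ratio with a 95% Wald confidence interval.
    Counts are the four cells of a 2x2 exposure-outcome table."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for a 2x2 table.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 20/80 inactive among cases/controls vs 10/90 active.
or_, lo, hi = odds_ratio(20, 80, 10, 90)
```

Adjusted ORs like the study's multivariate estimate come from logistic regression rather than this raw 2×2 calculation.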

  20. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Owing to their popularity, these approaches have also been applied to healthcare data for effectively diagnosing diseases, obtaining better results than traditional approaches. Soft computing approaches can adapt themselves to the problem domain. Another strength is a good balance between the exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable and efficient, and thus well suited to healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. The literature shows that a large number of soft computing approaches have been applied to effectively diagnose and predict diseases from healthcare data; among them are particle swarm optimization, genetic algorithms, artificial neural networks, support vector machines, etc. A detailed discussion of these approaches is presented in the literature section. This work summarizes the soft computing approaches used in the healthcare domain in the last decade. These approaches are categorized into five groups based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed in the above categories, and all discussed techniques are also summarized in tables. This work also focuses on the accuracy rate of soft computing techniques, and tabular information is provided for

  1. 77 FR 4568 - Annual Computational Science Symposium; Public Conference

    Science.gov (United States)

    2012-01-30

    ...] Annual Computational Science Symposium; Public Conference AGENCY: Food and Drug Administration, HHS... with the Pharmaceutical Users Software Exchange (PhUSE), is announcing a public conference entitled ``The FDA/PhUSE Annual Computational Science Symposium.'' The purpose of the conference is to help the...

  2. Language Choice and Use of Malaysian Public University Lecturers in the Education Domain

    Directory of Open Access Journals (Sweden)

    Tam Lee Mei

    2016-02-01

    Full Text Available It is a norm for people from a multilingual and multicultural country such as Malaysia to speak at least two or more languages. The Malaysian multilingual situation thus results in speakers having to make decisions about which languages are to be used for different purposes in different domains. In order to explain the phenomenon of language choice, Fishman's domain analysis (1964) was adapted for this research. According to Fishman's domain analysis, language choice and use may depend on the speaker's experiences situated in different settings, the different language repertoires available to the speaker, different interlocutors and different topics. Such situations inevitably cause barriers and difficulties for professionals who work in the education domain. Therefore, the purpose of this research is to explore the language choice and use of Malaysian public university lecturers in the education domain and to investigate whether any significant differences exist between ethnicity and field of study with respect to the lecturers' English language choice and use. 200 survey questionnaires were distributed to examine the details of the lecturers' language choice and use. The findings of this research reveal that all of the respondents generally preferred to choose and use the English language in both the formal and informal education domain. Besides, all of the respondents claimed that they chose and used more than one language. It is also found that the ethnicity and field of study of the respondents influence language choice and use in the education domain. In addition, this research suggests that language and educational policy makers have been largely successful in raising the role and status of the English language as the medium of instruction in tertiary education while maintaining the Malay language in an important role in communicative acts, thus characterizing the lecturers' language choice and use. Keywords: Language

  3. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38 %. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5 % better

  4. Concrete domains

    OpenAIRE

    Kahn, G.; Plotkin, G.D.

    1993-01-01

    This paper introduces the theory of a particular kind of computation domains called concrete domains. The purpose of this theory is to find a satisfactory framework for the notions of coroutine computation and sequentiality of evaluation.

  5. Blockchain-based Public Key Infrastructure for Inter-Domain Secure Routing

    OpenAIRE

    de la Rocha Gómez-Arevalillo , Alfonso; Papadimitratos , Panos

    2017-01-01

    International audience; A gamut of secure inter-domain routing protocols has been proposed in the literature. They use traditional PGP-like and centralized Public Key Infrastructures for trust management. In this paper, we propose our alternative approach for managing security associations, Secure Blockchain Trust Management (SBTM), a trust management system that instantiates a blockchain-based PKI for the operation of securerouting protocols. A main motivation for SBTM is to facilitate gradu...

  6. Remotely Piloted Aircraft and War in the Public Relations Domain

    Science.gov (United States)

    2014-10-01


  7. Public computing options for individuals with cognitive impairments: survey outcomes.

    Science.gov (United States)

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  8. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    Computational science is now indispensable to the solution of complex problems in every sector, from traditional science and engineering domains to such key areas as national security, public health...

  9. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  10. Public Services 2.0: The Impact of Social Computing on Public Services

    NARCIS (Netherlands)

    Huijboom, Noor; Broek, Tijs Van Den; Frissen, Valerie; Kool, Linda; Kotterink, Bas; Nielsen, Morten Meyerhoff; Millard, Jeremy

    2009-01-01

    The report gives an overview of the main trends of Social Computing, in the wider context of an evolving public sector, and in relation to relevant government trends and normative policy visions within and across EU Member States on future public services. It then provides an exhaustive literature

  11. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  12. Parallel computing of a climate model on the dawn 1000 by domain decomposition method

    Science.gov (United States)

    Bi, Xunqiang

    1997-12-01

    In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massive parallel computer made by National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. The potential ways to increase the speed-up ratio and exploit more resources of future massively parallel supercomputation are also discussed.

  13. Assessing water availability over peninsular Malaysia using public domain satellite data products

    International Nuclear Information System (INIS)

    Ali, M I; Hashim, M; Zin, H S M

    2014-01-01

    Water availability monitoring is an essential task for water resource sustainability and security. In this paper, the assessment of a satellite remote sensing technique for determining water availability is reported. A water-balance analysis is used to compute the spatio-temporal water availability, with the main inputs, precipitation and the actual evapotranspiration rate (AET), fully derived from the public-domain satellite products of the Tropical Rainfall Measuring Mission (TRMM) and MODIS, respectively. Both satellite products were first calibrated against selected local precipitation and AET samples. Multi-temporal data sets acquired during 2000-2010 were used in this study. The results of the study indicate strong agreement of monthly water availability with the basin flow rate (r² = 0.5, p < 0.001). Similar agreement was also noted between the estimated annual average water availability and the in-situ measurements. It is therefore concluded that the method devised in this study provides a new alternative for water availability mapping over large areas; it offers a timely and cost-effective method that also provides the comprehensive spatio-temporal patterns crucial in water resource planning to ensure water security
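The water-balance computation described is, at its core, a per-grid-cell subtraction of actual evapotranspiration from precipitation. A toy sketch, with hypothetical values standing in for the calibrated TRMM and MODIS rasters:

```python
import numpy as np

def water_availability(precip, aet):
    """Grid-cell water availability as P - AET (mm/month).
    Arrays stand in for co-registered precipitation and AET rasters."""
    p = np.asarray(precip, dtype=float)
    e = np.asarray(aet, dtype=float)
    return p - e

# Hypothetical 2x2 monthly rasters in mm/month (illustrative only).
precip = np.array([[210.0, 180.0], [95.0, 240.0]])
aet = np.array([[120.0, 130.0], [110.0, 150.0]])
wa = water_availability(precip, aet)   # negative cells indicate a deficit
```

In a real workflow the two rasters would first be calibrated against local gauge and flux measurements and resampled to a common grid, as the abstract describes.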

  14. Domain decomposition method for solving elliptic problems in unbounded domains

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Zhidkov, E.P.

    1991-01-01

    Computational aspects of the box domain decomposition (DD) method for solving boundary value problems in an unbounded domain are discussed. A new variant of the DD-method for elliptic problems in unbounded domains is suggested. It is based on the partitioning of an unbounded domain adapted to the given asymptotic decay of an unknown function at infinity. The comparison of computational expenditures is given for boundary integral method and the suggested DD-algorithm. 29 refs.; 2 figs.; 2 tabs

  15. Suburban development – a search for public domains in Danish suburban neighbourhoods

    DEFF Research Database (Denmark)

    Melgaard, Bente; Bech-Danielsen, Claus

    In recent years some of the post-war Danish suburbs have been facing great challenges – social segregation, demographic changes and challenges in building technology. In particular, segregation prevents social life from unfolding across social, economic and cultural borders. Therefore, in this paper, potentials for bridge-building across the enclaves of the suburb are looked for through a combined architectural-anthropological mapping of public spaces in a specific suburb in Denmark, the analyses being carried out in the light of Hajer & Reijndorp's definition of public domains and the term exchange

  16. The Value of Privacy and Surveillance Drones in the Public Domain : Scrutinizing the Dutch Flexible Deployment of Mobile Cameras Act

    NARCIS (Netherlands)

    Gerdo Kuiper; Quirine Eijkman

    2017-01-01

    The flexible deployment of drones in the public domain is assessed in this article from a legal-philosophical perspective. On the basis of the theories of Dworkin and Moore, the distinction between individual rights and collective security policy goals is discussed. Mobile cameras in the public domain

  17. Agents unleashed a public domain look at agent technology

    CERN Document Server

    Wayner, Peter

    1995-01-01

    Agents Unleashed: A Public Domain Look at Agent Technology covers details of building a secure agent realm. The book discusses the technology for creating seamlessly integrated networks that allow programs to move from machine to machine without leaving a trail of havoc; as well as the technical details of how an agent will move through the network, prove its identity, and execute its code without endangering the host. The text also describes the organization of the host's work processing an agent; error messages, bad agent expulsion, and errors in XLISP-agents; and the simulators of errors, f

  18. The Computer Revolution and Physical Chemistry.

    Science.gov (United States)

    O'Brien, James F.

    1989-01-01

    Describes laboratory-oriented software programs that are short, time-saving, eliminate computational errors, and not found in public domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)

  19. The International River Interface Cooperative: Public Domain Software for River Flow and Morphodynamics (Invited)

    Science.gov (United States)

    Nelson, J. M.; Shimizu, Y.; McDonald, R.; Takebayashi, H.

    2009-12-01

    The International River Interface Cooperative is an informal organization made up of academic faculty and government scientists with the goal of developing, distributing and providing education for a public-domain software interface for modeling river flow and morphodynamics. Formed in late 2007, the group released the first version of this interface (iRIC) in late 2009. iRIC includes models for two and three-dimensional flow, sediment transport, bed evolution, groundwater-surface water interaction, topographic data processing, and habitat assessment, as well as comprehensive data and model output visualization, mapping, and editing tools. All the tools in iRIC are specifically designed for use in river reaches and utilize common river data sets. The models are couched within a single graphical user interface so that a broad spectrum of models are available to users without learning new pre- and post-processing tools. The first version of iRIC was developed by combining the USGS public-domain Multi-Dimensional Surface Water Modeling System (MD_SWMS), developed at the USGS Geomorphology and Sediment Transport Laboratory in Golden, Colorado, with the public-domain river modeling code NAYS developed by the Universities of Hokkaido and Kyoto, Mizuho Corporation, and the Foundation of the River Disaster Prevention Research Institute in Sapporo, Japan. Since this initial effort, other Universities and Agencies have joined the group, and the interface has been expanded to allow users to integrate their own modeling code using Executable Markup Language (XML), which provides easy access and expandability to the iRIC software interface. In this presentation, the current components of iRIC are described and results from several practical modeling applications are presented to illustrate the capabilities and flexibility of the software. In addition, some future extensions to iRIC are demonstrated, including software for Lagrangian particle tracking and the prediction of

  20. Domains of State-Owned, Privately Held, and Publicly Traded Firms in International Competition.

    Science.gov (United States)

    Mascarenhas, Briance

    1989-01-01

    Hypotheses relating ownership to domain differences among state-owned, publicly traded, and privately held firms in international competition were examined in a controlled field study of the offshore drilling industry. Ownership explained selected differences in domestic market dominance, international presence, and customer orientation, even…

  1. Code and papers: computing publication patterns in the LHC era

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  2. On Stability of Exact Transparent Boundary Condition for the Parabolic Equation in Rectangular Computational Domain

    Science.gov (United States)

    Feshchenko, R. M.

    Recently a new exact transparent boundary condition (TBC) for the 3D parabolic wave equation (PWE) in a rectangular computational domain was derived. However, in the obtained form it does not appear to be unconditionally stable when used with, for instance, the Crank-Nicolson finite-difference scheme. In this paper two new formulations of the TBC for the 3D PWE in a rectangular computational domain are reported, which are likely to be unconditionally stable. They are based on an unconditionally stable, fully discrete TBC for the Crank-Nicolson scheme for the 2D PWE. These new forms of the TBC can be used for numerical solution of the 3D PWE when higher precision is required.
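For context, the Crank-Nicolson scheme mentioned above averages the spatial operator between two time levels, giving an implicit, unconditionally stable, second-order step. The sketch below applies it to the simpler 1D heat equation with plain zero Dirichlet ends; those ends are a stand-in, since the paper's point is replacing such artificial boundaries with an exact transparent boundary condition.

```python
import numpy as np

def crank_nicolson(u0, dx, dt, steps):
    """Crank-Nicolson time stepping for u_t = u_xx on interior points,
    with zero Dirichlet boundaries implied by the matrix truncation."""
    n = len(u0)
    r = dt / (2.0 * dx * dx)
    # Implicit (A) and explicit (B) halves of the averaged Laplacian.
    A = (1.0 + 2.0 * r) * np.eye(n) - r * np.eye(n, k=1) - r * np.eye(n, k=-1)
    B = (1.0 - 2.0 * r) * np.eye(n) + r * np.eye(n, k=1) + r * np.eye(n, k=-1)
    u = np.asarray(u0, dtype=float)
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)   # one implicit step
    return u

# Decaying sine mode of the heat equation on (0, 1): u ~ exp(-pi^2 t) sin(pi x).
dx = 1.0 / 50
x = np.linspace(dx, 1.0 - dx, 49)
u = crank_nicolson(np.sin(np.pi * x), dx, dt=1e-3, steps=100)
```

A production code would use a tridiagonal (banded) solver instead of a dense solve, and a TBC would modify the boundary rows rather than zeroing them.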

  3. DATABASES AND THE SUI-GENERIS RIGHT – PROTECTION OUTSIDE THE ORIGINALITY. THE DISREGARD OF THE PUBLIC DOMAIN

    Directory of Open Access Journals (Sweden)

    Monica LUPAȘCU

    2018-05-01

    Full Text Available This study focuses on databases as they are regulated by Directive no. 96/9/EC on the legal protection of databases. There are also several references to Romanian Law no. 8/1996 on copyright and neighbouring rights, which implements the mentioned European Directive. The study analyses certain effects that the sui-generis protection has on the public domain and tries to demonstrate that the regulation specific to databases neglects the interests associated with the public domain. The effect of such regulation is the abusive creation of some databases in which the public domain (meaning information not protected by copyright, such as news, ideas, procedures, methods, systems, processes, concepts, principles and discoveries) ends up being encapsulated and made available only to some private interests, access to the public domain thus being regulated indirectly. The study begins by explaining the sui-generis right and its origin. The first mention of databases can be found in the "Green Paper on Copyright" (1998), a document that clearly shows that database protection was thought to cover a sphere of non-protectable information from the scientific and industrial fields. Several arguments are made by the author, most of them based on the report of the public consultation held in 2014 regarding the necessity of the sui-generis right. There are also references to specific case law, namely British Horseracing Board v William Hill and Fixtures Marketing Ltd. The ECJ's decision in that case is of great importance in supporting the public interest in accessing information in certain restricted fields derived from a maker's activities, because in the absence of the sui-generis right all this information could be freely accessed and used.

  4. The complexity of changes in the domain of managing public expenditures

    Directory of Open Access Journals (Sweden)

    Dimitrijević Marina

    2016-01-01

    Full Text Available Public expenditures are a huge problem in contemporary states. Under the conditions of a global economic crisis, and in circumstances of a high level of citizen dissatisfaction with the former methods of funding and managing the public sector (reflected in the erosion of funding sources, irrational spending of public expenditure funds, and increases in the budget deficit and the level of public debt), changes in the domain of managing public expenditures have become a priority. By their nature, these changes are complex and long-lasting, and they should bring significant improvements in the field of public expenditure; they have to provide for the lawful and purposeful spending of public funds. They are expected to lower the public revenues needed to finance public expenditure, to improve production and competition in the market economy, and to increase personal consumption, the standard of living and the quality of life of the population. Regardless of the social, economic, legal or political environment of each state, the topical issue of reforming the management of public expenditures seems to imply a return to a somewhat neglected need for the public sector to function within its own financial means. The processes of state modernisation and of advancing the management of public expenditures call for a realistic evaluation of the existing conditions and circumstances in which these processes occur, as well as an assessment of the potential and actual risks that may hinder their effectiveness. Otherwise, the establishment of a significant level of responsibility in spending budget funds and a greater transparency of public expenditure may remain far-fetched goals.

  5. Combating Identity Fraud in the Public Domain: Information Strategies for Healthcare and Criminal Justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and public domain: increasing interorganisational co-operation and increasing digitisation. Nowadays, more and more processes within and between organisations take place electronically. These developments are visible on local, national and European scale.

  6. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    Science.gov (United States)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to cores present on a dual quad-core shared-memory system (eight total processors). We note that this message passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared memory platform indicate nearly-ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel, and yields excellent speed-up on SMP hardware.

  7. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to substitute foreign-language computer terms that had been in earlier use with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). Sixteen years on, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation was conducted to assess the usage of official computer terms in scientific publications written in Bahasa Indonesia. The data sources used in this observation are publications by academics, particularly in the computer science field. The method is divided into four stages. The first stage is metadata harvesting using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). The second is converting the harvested documents (in PDF format) to plain text. The third stage is text preprocessing in preparation for string matching. The final stage searches for the 629 official SPI computer terms using the Boyer-Moore algorithm. We observed 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
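    The matching stage of such a pipeline is easy to illustrate. The sketch below uses the Boyer-Moore-Horspool simplification (bad-character shifts only), not the full Boyer-Moore algorithm of the paper, and the sample sentence and search term are hypothetical:

    ```python
    def horspool_search(text, pattern):
        """Boyer-Moore-Horspool: slide a window over the text and skip
        ahead using the bad-character shift of the window's last character."""
        m, n = len(pattern), len(text)
        if m == 0 or n < m:
            return []
        # Distance from each pattern character (except the last) to the pattern end.
        shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
        hits, i = [], 0
        while i <= n - m:
            if text[i:i + m] == pattern:
                hits.append(i)
            i += shift.get(text[i + m - 1], m)
        return hits

    # Hypothetical sample: find the SPI term "unduh" ("download") in a sentence.
    hits = horspool_search("silakan unduh berkas ini", "unduh")
    ```

    The shift table lets the scan jump several characters at a time past mismatching windows, which is why this family of algorithms suits repeated term searches over large harvested corpora.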

  8. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning over the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer and Internet access at the workplace, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  9. Logic and memory concepts for all-magnetic computing based on transverse domain walls

    International Nuclear Information System (INIS)

    Vandermeulen, J; Van de Wiele, B; Dupré, L; Van Waeyenberge, B

    2015-01-01

    We introduce a non-volatile digital logic and memory concept in which the binary data is stored in the transverse magnetic domain walls present in in-plane magnetized nanowires with sufficiently small cross sectional dimensions. We assign the digital bit to the two possible orientations of the transverse domain wall. Numerical proofs-of-concept are presented for a NOT-, AND- and OR-gate, a FAN-out as well as a reading and writing device. Contrary to the chirality based vortex domain wall logic gates introduced in Omari and Hayward (2014 Phys. Rev. Appl. 2 044001), the presented concepts remain applicable when miniaturized and are driven by electrical currents, making the technology compatible with the in-plane racetrack memory concept. The individual devices can be easily combined to logic networks working with clock speeds that scale linearly with decreasing design dimensions. This opens opportunities to an all-magnetic computing technology where the digital data is stored and processed under the same magnetic representation. (paper)

  10. Computer self efficacy as correlate of on-line public access ...

    African Journals Online (AJOL)

    The use of Online Public Access Catalogue (OPAC) by students has a lot of advantages and computer self-efficacy is a factor that could determine its effective utilization. Little appears to be known about colleges of education students‟ use of OPAC, computer self-efficacy and the relationship between OPAC and computer ...

  11. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  12. Domain Engineering

    Science.gov (United States)

    Bjørner, Dines

    Before software can be designed we must know its requirements. Before requirements can be expressed we must understand the domain. So it follows, from our dogma, that we must first establish precise descriptions of domains; then, from such descriptions, “derive” at least domain and interface requirements; and from those and machine requirements design the software, or, more generally, the computing systems.

  13. Computer-Assisted Management of Instruction in Veterinary Public Health

    Science.gov (United States)

    Holt, Elsbeth; And Others

    1975-01-01

    Reviews a course in Food Hygiene and Public Health at the University of Illinois College of Veterinary Medicine in which students are sequenced through a series of computer-based lessons or autotutorial slide-tape lessons, the computer also being used to route, test, and keep records. Since grades indicated mastery of the subject, the course will…

  14. Time-domain modeling of electromagnetic diffusion with a frequency-domain code

    NARCIS (Netherlands)

    Mulder, W.A.; Wirianto, M.; Slob, E.C.

    2007-01-01

    We modeled time-domain EM measurements of induction currents for marine and land applications with a frequency-domain code. An analysis of the computational complexity of a number of numerical methods shows that frequency-domain modeling followed by a Fourier transform is an attractive choice if a

  15. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science places considerable weight on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and ...

  16. How You Can Protect Public Access Computers "and" Their Users

    Science.gov (United States)

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  17. A scoping review of cloud computing in healthcare.

    Science.gov (United States)

    Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin

    2015-03-19

    Cloud computing is a recent and fast-growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allows for new ways of developing, delivering and using services. Cloud computing is often used in an "OMICS context", e.g. for computing in genomics, proteomics and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers, who consolidated their findings. 102 publications were analyzed and 6 main topics were found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-per-use characteristics of cloud-based services, avoiding upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing, and the characteristics of the particular cloud computing approach are unfortunately not really illustrated. Even though cloud computing in healthcare is of growing interest, only a few successful implementations exist so far, and many papers just use the term "cloud" synonymously for "using virtual machines" or "web

  18. Domain fusion analysis by applying relational algebra to protein sequence and domain databases.

    Science.gov (United States)

    Truong, Kevin; Ikura, Mitsuhiko

    2003-05-06

    Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
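    The relational formulation of domain fusion can be sketched directly in SQL. The toy schema and data below are hypothetical, not drawn from SWISS-PROT or Pfam: a single `hit` table records which domain appears on which protein, and a self-join finds domain pairs fused on one protein whose members sit on separate proteins in a second organism — the predicted functionally linked partners.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE hit (protein TEXT, organism TEXT, domain TEXT);
    -- Hypothetical toy data: in E. coli, domains A and B are fused on one
    -- protein; in yeast they occur on two separate proteins, which domain
    -- fusion analysis would predict to be functionally linked.
    INSERT INTO hit VALUES
      ('ec_P1', 'ecoli', 'A'), ('ec_P1', 'ecoli', 'B'),
      ('sc_P7', 'yeast', 'A'), ('sc_P9', 'yeast', 'B'),
      ('sc_P9', 'yeast', 'C');
    """)

    -- = comment syntax above is SQL; below we run the fusion query itself.
    # f1/f2 find domain pairs fused on a single protein (the "Rosetta stone"),
    # h1/h2 locate those domains on *different* proteins in a second organism.
    rows = con.execute("""
    SELECT DISTINCT h1.protein, h2.protein
    FROM hit f1
    JOIN hit f2 ON f1.protein = f2.protein AND f1.domain < f2.domain
    JOIN hit h1 ON h1.domain = f1.domain
    JOIN hit h2 ON h2.domain = f2.domain
    WHERE h1.organism = 'yeast' AND h2.organism = 'yeast'
      AND h1.protein <> h2.protein
    """).fetchall()
    ```

    On the toy data this yields the single predicted pair `('sc_P7', 'sc_P9')`, illustrating how the whole analysis reduces to joins and projections over domain-hit relations.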

  19. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and Intel Xeon PHI. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
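    SeisCL itself is a 2D/3D viscoelastic OpenCL code; as a minimal illustration of the underlying time-domain finite-difference idea, the sketch below steps a 1D acoustic wave equation with a second-order stencil. All grid, velocity, and source parameters are arbitrary toy values, not those of the paper.

    ```python
    import numpy as np

    def propagate_1d(nx=200, nt=500, dx=5.0, dt=0.001, c=2000.0):
        """Minimal 1D acoustic finite-difference time stepper (second order in
        space and time), a toy analogue of a time-domain FWI modeling kernel."""
        assert c * dt / dx <= 1.0, "CFL stability condition violated"
        r2 = (c * dt / dx) ** 2
        u_prev = np.zeros(nx)
        u = np.zeros(nx)
        for it in range(nt):
            u_next = np.zeros(nx)  # fixed (zero) ends act as reflecting boundaries
            u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                            + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            # Ricker-like source wavelet injected at the centre of the grid
            t = it * dt
            arg = (np.pi * 25 * (t - 0.04)) ** 2
            u_next[nx // 2] += (1 - 2 * arg) * np.exp(-arg)
            u_prev, u = u, u_next
        return u

    wave = propagate_1d()
    ```

    The real kernel adds elasticity, attenuation, absorbing boundaries, and the adjoint pass, but the inner update has exactly this stencil-plus-source structure, which is what maps so well onto GPU work-items.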

  20. Implicit upwind schemes for computational fluid dynamics. Solution by domain decomposition

    International Nuclear Information System (INIS)

    Clerc, S.

    1998-01-01

    In this work, the numerical simulation of fluid dynamics equations is addressed. Implicit upwind schemes of finite volume type are used for this purpose. The first part of the dissertation deals with the improvement of computational precision in unfavourable situations. A non-conservative treatment of some source terms is studied in order to correct some shortcomings of the usual operator-splitting method. Besides, finite volume schemes based on Godunov's approach are unsuited to computing low Mach number flows. A modification of the upwinding by preconditioning is introduced to correct this defect. The second part deals with the solution of steady-state problems arising from an implicit discretization of the equations. A well-posed linearized boundary value problem is formulated. We prove the convergence of a domain decomposition algorithm of Schwarz type for this problem. This algorithm is implemented either directly, or in a Schur complement framework. Finally, another approach is proposed, which consists in decomposing the non-linear steady-state problem. (author)
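    The Schwarz-type decomposition mentioned above can be illustrated on the simplest possible model problem. The sketch below runs an alternating (multiplicative) Schwarz iteration for -u'' = f on two overlapping 1D subdomains; it is a textbook toy with made-up sizes, not the scheme of the dissertation.

    ```python
    import numpy as np

    def local_solve(f_loc, h, ul, ur):
        """Solve -u'' = f on a subinterval with Dirichlet values ul, ur."""
        n = len(f_loc)
        A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
        rhs = f_loc.copy()
        rhs[0] += ul / h**2
        rhs[-1] += ur / h**2
        return np.linalg.solve(A, rhs)

    def schwarz(f, h, overlap=8, n_iter=40):
        """Alternating Schwarz for -u'' = f, zero Dirichlet ends, two subdomains."""
        n = len(f)
        mid = n // 2
        a, b = mid + overlap, mid - overlap  # subdomain interface indices
        u = np.zeros(n)
        for _ in range(n_iter):
            # Left solve uses the latest right-subdomain value as its boundary,
            # and vice versa; the overlap makes the iteration contract.
            u[:a] = local_solve(f[:a], h, 0.0, u[a])
            u[b:] = local_solve(f[b:], h, u[b - 1], 0.0)
        return u

    n = 63
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)   # manufactured RHS; exact u = sin(pi x)
    u = schwarz(f, h)
    err = np.max(np.abs(u - np.sin(np.pi * x)))
    ```

    With a healthy overlap the iteration converges geometrically, so after a few dozen sweeps the remaining error is just the O(h²) discretization error of the stencil.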

  1. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the greatest losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by completing computations only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
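    The row-block decomposition behind desktop parallel grid computation is easy to sketch. The paper uses Java multithreading; the hypothetical sketch below uses Python's `concurrent.futures` on a toy diffusion-style update, purely to show the split-update-reassemble pattern, not any shallow-water physics.

    ```python
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def diffuse_block(depth, lo, hi):
        """Five-point averaging update for rows lo..hi-1; zero padding
        supplies the ghost rows/columns at the grid edges."""
        padded = np.pad(depth, 1)
        centre = padded[lo + 1:hi + 1, 1:-1]
        return 0.2 * (centre
                      + padded[lo:hi, 1:-1] + padded[lo + 2:hi + 2, 1:-1]
                      + padded[lo + 1:hi + 1, :-2] + padded[lo + 1:hi + 1, 2:])

    def parallel_step(depth, n_workers=4):
        """One explicit time step with the grid split into row blocks,
        each block handed to a worker thread, then stacked back together."""
        ny = depth.shape[0]
        bounds = np.linspace(0, ny, n_workers + 1, dtype=int)
        with ThreadPoolExecutor(n_workers) as pool:
            blocks = list(pool.map(lambda b: diffuse_block(depth, b[0], b[1]),
                                   zip(bounds[:-1], bounds[1:])))
        return np.vstack(blocks)

    depth = np.zeros((64, 64))
    depth[32, 32] = 1.0          # toy flood source in the middle of the grid
    for _ in range(10):
        depth = parallel_step(depth)
    ```

    Each block reads the previous global state and writes only its own rows, so no locking is needed within a step; the same partitioning idea carries over to Java threads or MPI ranks.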

  2. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the common processing and numerical calculations. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and complex-number arithmetic. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The results show that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability and high efficiency.

  3. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

    Full Text Available In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of (output) image quality on both processing architectures.
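    The pipeline the paper benchmarks — transform, multiply by a transfer function, inverse transform — can be sketched on the CPU with NumPy; the CUDA kernels themselves are not reproduced here, and the Gaussian filter below uses arbitrary toy parameters.

    ```python
    import numpy as np

    def gaussian_lowpass_fft(img, sigma):
        """Filter in the frequency domain: F -> F * H -> inverse FFT, where
        H(u, v) = exp(-(u^2 + v^2) / (2 * sigma^2)) is a Gaussian transfer
        function over integer frequency indices."""
        rows, cols = img.shape
        u = np.fft.fftfreq(rows)[:, None] * rows
        v = np.fft.fftfreq(cols)[None, :] * cols
        H = np.exp(-(u**2 + v**2) / (2.0 * sigma**2))
        return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

    # Toy input: white noise. Low-pass filtering should shrink its variance
    # while leaving the mean (the DC component, where H = 1) untouched.
    rng = np.random.default_rng(0)
    img = rng.standard_normal((64, 64))
    smooth = gaussian_lowpass_fft(img, sigma=4.0)
    ```

    On a GPU the three stages map to a forward FFT, a trivially parallel element-wise multiply kernel, and an inverse FFT, which is where the reported speed-ups come from.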

  4. Frequency domain image filtering using cuda

    International Nuclear Information System (INIS)

    Rajput, M.A.; Khan, U.A.

    2014-01-01

    In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of (output) image quality on both processing architectures. (author)

  5. Public policy and regulatory implications for the implementation of Opportunistic Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning; Henten, Anders

    2012-01-01

    Opportunistic Cloud Computing Services (OCCS) is a social network approach to the provisioning and management of cloud computing services for enterprises. This paper discusses how public policy and regulations will impact on OCCS implementation. We rely on documented, publicly available government and corporate policies on the adoption of cloud computing services and deduce the impact of these policies on the adoption of opportunistic cloud computing services. We conclude that there are regulatory challenges on data protection that raise issues for cloud computing adoption in general, and that the lack of a single globally accepted data protection standard poses some challenges for a very successful implementation of OCCS for companies. However, the direction of current public and corporate policies on cloud computing makes a good case for them to try out opportunistic cloud computing services.

  6. Knowledge-based public health situation awareness

    Science.gov (United States)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential bioterrorism threats before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must weigh the value of so-called 'syndromic surveillance systems' against the costs involved in the design, development, implementation and maintenance of such systems, and the costs involved in investigating the inevitable false alarms [1]. In this article we introduce a new perspective on the problem domain with a paradigm shift from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we introduce a different methodology, applying concepts from information science, computer science, cognitive science and human-computer interaction to the design and development of so-called 'public health situation awareness systems'. We share some of our design and implementation concepts for the prototype system under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction, from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enable incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  7. Quality criteria for electronic publications in medicine.

    Science.gov (United States)

    Schulz, S; Auhuber, T; Schrader, U; Klar, R

    1998-01-01

    This paper defines "electronic publications in medicine (EPM)" as computer based training programs, databases, knowledge-based systems, multimedia applications and electronic books running on standard platforms and available by usual distribution channels. A detailed catalogue of quality criteria as a basis for development and evaluation of EPMs is presented. The necessity to raise the quality level of electronic publications is stressed considering aspects of domain knowledge, software engineering, media development, interface design and didactics.

  8. Optimizing security of cloud computing within the DoD

    OpenAIRE

    Antedomenico, Noemi

    2010-01-01

    Approved for public release; distribution is unlimited. What countermeasures best strengthen the confidentiality, integrity and availability (CIA) of the implementation of cloud computing within the DoD? This question will be answered by analyzing threats and countermeasures within the context of the ten domains comprising the Certified Information System Security Professional (CISSP) Common Body of Knowledge (CBK). The ten domains that will be used in this analysis include access control; ...

  9. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    Science.gov (United States)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of full shallow water equation solvers have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet/dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell floods, register its surrounding cells. The time for this additional process is kept small by checking only cells at the wet/dry interface, and overall computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood inundation model. The proposed ADU method is implemented with the 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in one-half to one-tenth of the computation time while producing the same results as the simulation without the ADU method.
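    The registration idea — compute only on active cells, and grow the active set as the front advances — can be sketched as follows. The redistribution rule below is a hypothetical toy, not the paper's local inertial solver; only the active-set bookkeeping illustrates the ADU concept.

    ```python
    import numpy as np

    def flood_step_adu(depth, active, threshold=1e-6):
        """One toy flood step computed only on 'active' cells. Whenever a cell
        is wet after the step, it and its four neighbours are (re)registered,
        mimicking ADU's tracing of the wet/dry front."""
        ny, nx = depth.shape
        new_depth = depth.copy()
        for (i, j) in list(active):
            # Toy rule: move a fraction of the height difference to each
            # lower neighbour (fluxes are pairwise, so mass is conserved).
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx and depth[i, j] > depth[ni, nj]:
                    flux = 0.1 * (depth[i, j] - depth[ni, nj])
                    new_depth[i, j] -= flux
                    new_depth[ni, nj] += flux
        # Update the active set: every wet cell plus its neighbours.
        new_active = set()
        for (i, j) in zip(*np.nonzero(new_depth > threshold)):
            new_active.add((int(i), int(j)))
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = int(i) + di, int(j) + dj
                if 0 <= ni < ny and 0 <= nj < nx:
                    new_active.add((ni, nj))
        return new_depth, new_active

    depth = np.zeros((20, 20))
    depth[10, 10] = 1.0        # initial flood source
    active = {(10, 10)}
    for _ in range(5):
        depth, active = flood_step_adu(depth, active)
    ```

    Cells far from the front are never visited, which is exactly where the reported two- to ten-fold savings come from when the flooded area is a small fraction of the model domain.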

  10. A Fast, Efficient Domain Adaptation Technique for Cross-Domain Electroencephalography (EEG)-Based Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Xin Chai

    2017-05-01

    Full Text Available Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis for patients. However, the underlying EEG sensor signals are always non-stationary if they are sampled from different experimental sessions or subjects. This results in deterioration of the classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy of the marginal distribution. However, for EEG sensor signals, both marginal and conditional distributions may be mismatched. In addition, existing domain adaptation strategies always require a high level of additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper in order to integrate both the marginal and conditional distributions within a unified framework (without any labeled samples from target subjects). Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term. This significantly decreases the time complexity of our domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and the unlabeled target domain can be reduced, and logistic regression (LR) can be applied to the new source domain in order to train a classifier for use in the target domain, since the aligned source domain follows a distribution similar to that of the target domain. We compare our ASFM method with six typical approaches using a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed. The subject-to-subject offline experimental results demonstrate that our method achieves a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, as compared with a state-of-the-art method, the subspace alignment auto-encoder (SAAE), which
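
    The core idea, matching the marginal distribution of the source to the target with a plain linear map and no regularization term, can be illustrated with a deliberately simplified 1-D sketch (a mean-variance alignment stand-in for the paper's subspace transformation; all names and data below are invented):

```python
import math

def marginal_align(source, target):
    """Toy marginal-distribution matching: linearly map the source
    samples so their mean and variance equal the target's. This is a
    simplified stand-in for ASFM's subspace transformation."""
    def stats(xs):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / len(xs)
        return m, math.sqrt(v)
    ms, ss = stats(source)
    mt, st = stats(target)
    return [(x - ms) / ss * st + mt for x in source]

src = [1.0, 2.0, 3.0, 4.0]       # e.g. EEG features from one subject
tgt = [10.0, 12.0, 14.0, 16.0]   # features from an unlabeled target subject
aligned = marginal_align(src, tgt)
print(aligned)  # ≈ [10.0, 12.0, 14.0, 16.0]
```

    After such an alignment a classifier trained on the transformed source samples sees a distribution close to the target's, which is the premise for applying LR as described above.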

  11. The public understanding of nanotechnology in the food domain: the hidden role of views on science, technology, and nature.

    Science.gov (United States)

    Vandermoere, Frederic; Blanchemanche, Sandrine; Bieberstein, Andrea; Marette, Stephan; Roosen, Jutta

    2011-03-01

    In spite of great expectations about the potential of nanotechnology, this study shows that people are rather ambiguous and pessimistic about nanotechnology applications in the food domain. Our findings are drawn from a survey of public perceptions about nanotechnology food and nanotechnology food packaging (N = 752). Multinomial logistic regression analyses further reveal that knowledge about food risks and nanotechnology significantly influences people's views about nanotechnology food packaging. However, knowledge variables were unrelated to support for nanofood, suggesting that an increase in people's knowledge might not be sufficient to bridge the gap between the excitement some business leaders in the food sector have and the restraint of the public. Additionally, opposition to nanofood was not related to the use of heuristics but to trust in governmental agencies. Furthermore, the results indicate that public perceptions of nanoscience in the food domain significantly relate to views on science, technology, and nature.

  12. Domain decomposition parallel computing for transient two-phase flow of nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of); Choi, Hyoung Gwon [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    KAERI (Korea Atomic Energy Research Institute) has been developing a multi-dimensional two-phase flow code named CUPID for multi-physics and multi-scale thermal hydraulics analysis of light water reactors (LWRs). The CUPID code has been validated against a set of conceptual problems and experimental data. In this work, the CUPID code has been parallelized based on the domain decomposition method with the Message Passing Interface (MPI) library. For domain decomposition, the CUPID code provides both manual and automatic methods using the METIS library. For effective memory management, the Compressed Sparse Row (CSR) format is adopted, which is one of the methods for representing a sparse asymmetric matrix. The CSR format stores only the non-zero values and their positions (row and column). By performing verification on the fundamental problem set, the parallelization of CUPID has been successfully confirmed. Since the scalability of a parallel simulation is generally known to be better for fine mesh systems, three different scales of mesh system are considered: 40,000 meshes for the coarse system, 320,000 meshes for the mid-size system, and 2,560,000 meshes for the fine system. In the given geometry, both single- and two-phase calculations were conducted. In addition, two types of preconditioners for the matrix solver were compared: diagonal and incomplete LU. In terms of enhancing parallel performance, hybrid OpenMP/MPI parallel computing for the pressure solver was examined; the hybrid calculation showed improved scalability for multi-core parallel computation.
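
    The CSR storage scheme mentioned above can be sketched in a few lines; this is a generic illustration of the format, not CUPID code:

```python
# Minimal sketch of the Compressed Sparse Row (CSR) format: only the
# non-zero values and their positions are stored, in three arrays.

def to_csr(dense):
    """Convert a dense matrix (list of rows) to CSR arrays."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, a in enumerate(row):
            if a != 0:
                values.append(a)
                col_idx.append(j)
        row_ptr.append(len(values))  # row i spans row_ptr[i]..row_ptr[i+1]
    return values, col_idx, row_ptr

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x using only the CSR arrays."""
    y = []
    for i in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

A = [[4.0, 0.0, 1.0],
     [0.0, 3.0, 0.0],
     [2.0, 0.0, 5.0]]
vals, cols, ptr = to_csr(A)
print(vals)                                          # [4.0, 1.0, 3.0, 2.0, 5.0]
print(csr_matvec(vals, cols, ptr, [1.0, 1.0, 1.0]))  # [5.0, 3.0, 7.0]
```

    For the large sparse pressure matrices described above, storing only non-zeros in this way is what makes the memory footprint manageable.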

  13. A protein domain interaction interface database: InterPare

    Directory of Open Access Journals (Sweden)

    Lee Jungsul

    2005-08-01

    Full Text Available Abstract Background Most proteins function by interacting with other molecules. Their interaction interfaces are highly conserved throughout evolution to avoid undesirable interactions that lead to fatal disorders in cells. Rational drug discovery includes computational methods to identify the interaction sites of lead compounds on the target molecules. Identifying and classifying protein interaction interfaces on a large scale can help researchers discover drug targets more efficiently. Description We introduce a large-scale protein domain interaction interface database called InterPare http://interpare.net. It contains both inter-chain (between chains) interfaces and intra-chain (within chain) interfaces. InterPare uses three methods to detect interfaces: (1) the geometric distance method, which checks the distance between atoms that belong to different domains; (2) Accessible Surface Area (ASA), a method for detecting the buried region of a protein that is detached from the solvent when forming multimers or complexes; and (3) the Voronoi diagram, a computational geometry method that uses a mathematical definition of interface regions. InterPare includes visualization tools to display protein interior, surface, and interaction interfaces. It also provides statistics, such as the amino acid propensities of a queried protein, according to its interior, surface, and interface regions. The atom coordinates that belong to the interface, surface, and interior regions can be downloaded from the website. Conclusion InterPare is an open and public database server for protein interaction interface information. It contains large-scale interface data for proteins whose 3D structures are known. As of November 2004, there were 10,583 (geometric distance), 10,431 (ASA), and 11,010 (Voronoi diagram) entries in the Protein Data Bank (PDB) containing interfaces, according to the above three methods. In the case of the geometric distance method, there are 31,620 inter-chain domain-domain
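
    The first of the three detection methods, the geometric distance criterion, reduces to a pairwise distance test between atoms of different domains. A minimal sketch, with invented coordinates and a 5 Å cutoff chosen only for illustration:

```python
# Sketch of a geometric distance interface test: atoms from different
# domains that lie within a distance cutoff are flagged as interface
# atoms. Coordinates and cutoff are illustrative, not from InterPare.
import math

def interface_atoms(domain_a, domain_b, cutoff=5.0):
    """Return index pairs (i, j) of atoms closer than the cutoff."""
    pairs = []
    for i, p in enumerate(domain_a):
        for j, q in enumerate(domain_b):
            if math.dist(p, q) < cutoff:
                pairs.append((i, j))
    return pairs

dom_a = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]   # atoms of domain A
dom_b = [(3.0, 0.0, 0.0), (30.0, 0.0, 0.0)]   # atoms of domain B
print(interface_atoms(dom_a, dom_b))  # [(0, 0)]: one pair within 5 Å
```

    A production implementation would use a spatial index rather than the quadratic loop, but the criterion itself is exactly this distance test.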

  14. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem, and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on the nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing the computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, makes it possible to compute the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a level-set and Hamilton-Jacobi method.

  15. 76 FR 67418 - Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing...

    Science.gov (United States)

    2011-11-01

    ...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing... Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is... (USG) agencies to accelerate their adoption of cloud computing. The roadmap has been developed through...

  16. Towards an information strategy for combating identity fraud in the public domain: Cases from healthcare and criminal justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and the public domain: increasing interorganisational co-operation and increasing digitisation. More and more processes within and between organisations take place electronically, at local, national and European scales. The technological and organisational

  17. Gamma-Weighted Discrete Ordinate Two-Stream Approximation for Computation of Domain Averaged Solar Irradiance

    Science.gov (United States)

    Kato, S.; Smith, G. L.; Barker, H. W.

    2001-01-01

    An algorithm is developed for the gamma-weighted discrete ordinate two-stream approximation that computes profiles of domain-averaged shortwave irradiances for horizontally inhomogeneous cloudy atmospheres. The algorithm assumes that frequency distributions of cloud optical depth at unresolved scales can be represented by a gamma distribution though it neglects net horizontal transport of radiation. This algorithm is an alternative to the one used in earlier studies that adopted the adding method. At present, only overcast cloudy layers are permitted.
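
    The gamma weighting can be illustrated with the simplest quantity it affects, the domain-averaged direct-beam transmittance: averaging exp(-tau/mu) over a gamma distribution of optical depth has the standard closed form (1 + tau_mean/(k*mu))^(-k), which the numeric quadrature below reproduces. The parameter values are illustrative only, not taken from the paper:

```python
# Averaging a direct-beam transmittance exp(-tau/mu) over a gamma
# distribution of cloud optical depth tau (mean tau_mean, shape k).
import math

def gamma_pdf(tau, mean, k):
    """Gamma pdf with mean `mean` and shape `k` (scale = mean / k)."""
    theta = mean / k
    return tau ** (k - 1) * math.exp(-tau / theta) / (math.gamma(k) * theta ** k)

def mean_transmittance(mean, k, mu, n=20000, tau_max=100.0):
    """Trapezoidal quadrature of exp(-tau/mu) weighted by the gamma pdf."""
    h = tau_max / n
    s = 0.0
    for i in range(n + 1):
        tau = i * h
        f = 0.0 if tau == 0.0 else gamma_pdf(tau, mean, k) * math.exp(-tau / mu)
        s += (0.5 if i in (0, n) else 1.0) * f * h
    return s

mean_tau, k, mu = 2.0, 2.0, 0.5               # illustrative values
numeric = mean_transmittance(mean_tau, k, mu)
closed = (1.0 + mean_tau / (k * mu)) ** (-k)  # standard closed form
print(numeric, closed)                        # both ≈ 1/9
```

    Because the average of exp(-tau/mu) is not exp(-tau_mean/mu), carrying the full gamma distribution through the two-stream solution changes the domain-averaged irradiance relative to a plane-parallel calculation at the mean optical depth.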

  18. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open-access, peer-reviewed online journal that considers scientific articles in all areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with ecological research and with the construction and application of theories and methods of the computational sciences, including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major emphases of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation-intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic processes, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge-based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation-intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  19. Towards Domain-specific Flow-based Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert; Sarjoughian, Hessam S.

    2018-01-01

    Due to the significant growth of the demand for data-intensive computing, in addition to the emergence of new parallel and distributed computing technologies, scientists and domain experts are leveraging languages specialized for their problem domain, i.e., domain-specific languages, to help them describe their problems and solutions, instead of using general-purpose programming languages. The goal of these languages is to improve the productivity and efficiency of the development and simulation of concurrent scientific models and systems. Moreover, they help to expose parallelism and to specify the concurrency within a component or across different independent components. In this paper, we introduce the concept of domain-specific flow-based languages, which allows domain experts to use flow-based languages adapted to a particular problem domain. Flow-based programming is used to support concurrency, while…

  20. Computational domain discretization in numerical analysis of flow within granular materials

    Science.gov (United States)

    Sosnowski, Marcin

    2018-06-01

    The discretization of the computational domain is a crucial step in Computational Fluid Dynamics (CFD) because it influences not only the numerical stability of the analysed model but also the agreement of the obtained results with real data. Modelling flow in packed beds of granular materials is a very challenging task in terms of discretization due to the existence of narrow spaces between spherical granules contacting tangentially at a single point. The standard approach to this issue results in a low-quality mesh and, in consequence, unreliable results. Therefore the common method is to reduce the diameter of the modelled granules in order to eliminate the single-point contact between individual granules. The drawback of such a method is the distortion of the flow and of the contact heat resistance, among others. Therefore an innovative method is proposed in the paper: the single-point contact is extended to a cylinder-shaped volume contact. Such an approach eliminates the low-quality mesh elements and simultaneously introduces only slight distortion to the flow as well as to the contact heat transfer. The performed analysis of numerous test cases proves the great potential of the proposed method of meshing packed beds of granular materials.
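
    The cylinder-bridge treatment can be sketched geometrically: detect a (near-)tangential sphere-sphere contact and size a small cylinder bridging the two granules. The radius factor below is an invented modelling parameter for illustration, not a value from the paper:

```python
# Sketch of replacing a single-point sphere contact with a small
# bridging cylinder, to avoid degenerate mesh elements at the contact.
import math

def contact_bridge(c1, r1, c2, r2, tol=1e-6, radius_factor=0.2):
    """If two spheres are in (near-)tangential contact, return the
    (length, radius) of a bridging cylinder along the centre line,
    else None. radius_factor is an illustrative modelling choice."""
    d = math.dist(c1, c2)
    if abs(d - (r1 + r2)) > tol:
        return None  # not touching: no bridge needed
    radius = radius_factor * min(r1, r2)
    # How far the cylinder's end caps sit inside each sphere surface:
    length = (r1 - math.sqrt(r1**2 - radius**2)) + \
             (r2 - math.sqrt(r2**2 - radius**2))
    return length, radius

bridge = contact_bridge((0.0, 0.0, 0.0), 1.0, (2.0, 0.0, 0.0), 1.0)
print(bridge)  # (length, radius) of the bridging cylinder
```

    The mesher then fills this small cylinder with regular elements instead of collapsing cells around a tangent point.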

  1. Combining Public Domain and Professional Panoramic Imagery for the Accurate and Dense 3d Reconstruction of the Destroyed Bel Temple in Palmyra

    Science.gov (United States)

    Wahbeh, W.; Nebiker, S.; Fangi, G.

    2016-06-01

    This paper exploits the potential of dense multi-image 3D reconstruction of destroyed cultural heritage monuments by either using public domain touristic imagery only or by combining the public domain imagery with professional panoramic imagery. The focus of our work is placed on the reconstruction of the Temple of Bel, one of the Syrian heritage monuments, which was destroyed in September 2015 by the so-called "Islamic State". The great Temple of Bel is considered one of the most important religious buildings of the 1st century AD in the East, with a unique design. The investigations and the reconstruction were carried out using two types of imagery. The first are freely available generic touristic photos collected from the web. The second are panoramic images captured in 2010 for documenting those monuments. In the paper we present a 3D reconstruction workflow for both types of imagery using state-of-the-art dense image matching software, addressing the non-trivial challenges of combining uncalibrated public domain imagery with panoramic images with very wide baselines. We subsequently investigate the accuracy and completeness obtainable from the public domain touristic images alone and from their combination with spherical panoramas. We furthermore discuss the challenges of co-registering the weakly connected 3D point cloud fragments resulting from the limited coverage of the touristic photos. We then describe an approach using spherical photogrammetry as a virtual topographic survey, allowing the co-registration of a detailed and accurate single 3D model of the temple interior and exterior.

  2. The Administrative Impact of Computers on the British Columbia Public School System.

    Science.gov (United States)

    Gibbens, Trevor P.

    This case study analyzes and evaluates the administrative computer systems in the British Columbia public school organization in order to investigate the costs and benefits of computers, their impact on managerial work, their influence on centralization in organizations, and the relationship between organizational objectives and the design of…

  3. A quantitative evaluation of the relative status of journal and conference publications in computer science.

    OpenAIRE

    Coyle, Lorcan; Freyne, Jill; Smyth, Barry; Cunningham, Padraig

    2010-01-01

    While it is universally held by computer scientists that conference publications have a higher status in computer science than in other disciplines there is little quantitative evidence in support of this position. The importance of journal publications in academic promotion makes this a big issue since an exclusive focus on journal papers will miss many significant papers published at conferences in computer science. In this paper we set out to quantify the relative importance of journ...

  4. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    Science.gov (United States)

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  5. Enhancing public access to legal information : A proposal for a new official legal information generic top-level domain

    NARCIS (Netherlands)

    Mitee, Leesi Ebenezer

    2017-01-01

    Abstract: This article examines the use of a new legal information generic Top-Level Domain (gTLD) as a viable tool for easy identification of official legal information websites (OLIWs) and enhancing global public access to their resources. This intervention is necessary because of the existence of

  6. Design and development of semantic web-based system for computer science domain-specific information retrieval

    Directory of Open Access Journals (Sweden)

    Ritika Bansal

    2016-09-01

    Full Text Available In a semantic web-based system, the concept of ontology is used to search results by the contextual meaning of the input query instead of keyword matching. From the research literature, there seems to be a need for a tool which can provide an easy interface for complex queries in natural language and retrieve domain-specific information from an ontology. This research paper proposes an IRSCSD system (Information Retrieval System for Computer Science Domain) as a solution. This system offers advanced querying and browsing of structured data, with search results automatically aggregated and rendered directly in a consistent user interface, thus reducing the manual effort of users. The main objective of this research is therefore the design and development of a semantic web-based system integrating ontology towards domain-specific retrieval support. The methodology followed is piecemeal research involving the following stages. The first stage involves designing the framework for the semantic web-based system. The second stage builds the prototype for the framework using the Protégé tool. The third stage deals with converting natural language queries into the SPARQL query language using the Python-based QUEPY framework. The fourth stage involves firing the converted SPARQL queries at the ontology through Apache's Jena API to fetch the results. Lastly, the prototype has been evaluated in order to ensure its efficiency and usability. Thus, this research paper throws light on framework development for a semantic web-based system that assists in efficient retrieval of domain-specific information, interpretation of natural language queries into a semantic web language, creation of a domain-specific ontology and its mapping with related ontologies. The paper also provides approaches and metrics for ontology evaluation, applied to the prototype ontology in order to study performance based on the accessibility of the required domain-related information.
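
    The natural-language-to-SPARQL stage can be illustrated with a toy template-based converter. The prefix, property names and question template below are hypothetical, and this sketch does not reproduce QUEPY's actual API:

```python
# Toy sketch of mapping a natural-language question to a SPARQL query
# string. A real system (e.g. QUEPY) uses grammar-based templates; the
# ontology vocabulary here (cs:title, cs:author) is invented.

def question_to_sparql(question):
    """Map a 'who wrote X' question to a SPARQL SELECT query."""
    q = question.strip().rstrip("?").lower()
    prefix = "PREFIX cs: <http://example.org/cs-ontology#>\n"
    if q.startswith("who wrote "):
        topic = q[len("who wrote "):].strip()
        return (prefix +
                'SELECT ?author WHERE { ?work cs:title "%s" . '
                '?work cs:author ?author . }' % topic)
    raise ValueError("unsupported question form")

query = question_to_sparql("Who wrote quicksort?")
print(query)
```

    The generated query string would then be executed against the ontology, in the system above via Apache's Jena API.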

  7. Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health

    Science.gov (United States)

    Gorsky, Martin

    2015-01-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996–2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the ‘declinist’ historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour. PMID:26217072

  8. Wavefield extrapolation in pseudodepth domain

    KAUST Repository

    Ma, Xuxin

    2013-02-01

    Wavefields are commonly computed in the Cartesian coordinate frame. The efficiency of this approach is inherently limited due to spatial oversampling in deep layers, where the velocity is high and wavelengths are long. To alleviate this computational waste due to uneven wavelength sampling, we convert the vertical axis of the conventional domain from depth to vertical time, or pseudodepth. This creates a nonorthogonal Riemannian coordinate system. Isotropic and anisotropic wavefields can be extrapolated in the new coordinate frame with improved efficiency and good consistency with Cartesian-domain extrapolation results. Prestack depth migrations are also evaluated based on the wavefield extrapolation in the pseudodepth domain. © 2013 Society of Exploration Geophysicists. All rights reserved.
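
    The depth-to-pseudodepth conversion rests on the vertical-time integral tau(z) = ∫ dz / v(z). A minimal sketch with an invented two-layer velocity model:

```python
# Sketch of mapping depth to vertical time (pseudodepth):
# tau(z) is the cumulative vertical traveltime through v(z), so a
# fast deep layer spans less tau per metre than a slow shallow one.

def depth_to_tau(z_grid, v_of_z):
    """Cumulative vertical traveltime tau at each depth sample,
    using the midpoint velocity of each interval."""
    tau = [0.0]
    for z0, z1 in zip(z_grid[:-1], z_grid[1:]):
        v_mid = v_of_z(0.5 * (z0 + z1))
        tau.append(tau[-1] + (z1 - z0) / v_mid)
    return tau

# Illustrative two-layer model: slow shallow, fast deep (m/s).
v = lambda z: 1500.0 if z < 1000.0 else 3000.0
z = [0.0, 500.0, 1000.0, 1500.0, 2000.0]
tau = depth_to_tau(z, v)
print(tau)  # equal depth steps map to smaller tau steps in the fast layer
```

    Sampling the wavefield uniformly in tau rather than z is what removes the oversampling of the deep, high-velocity layers.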

  9. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  10. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross-section of a beam as the result of Boolean operations with so-called 'simple' shapes. Using generalisations, the class of 'simple' shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross-sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as the equilibrium diagrams of metal alloys, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data-processing software instruments already developed. The corresponding software to be developed subsequently will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.

  11. Application of ubiquitous computing in personal health monitoring systems.

    Science.gov (United States)

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    A possibility to significantly reduce the costs of public health systems is to increasingly use information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  12. Computed tear film and osmolarity dynamics on an eye-shaped domain

    Science.gov (United States)

    Li, Longfei; Braun, Richard J.; Driscoll, Tobin A.; Henshaw, William D.; Banks, Jeffrey W.; King-Smith, P. Ewen

    2016-01-01

    The concentration of ions, or osmolarity, in the tear film is a key variable in understanding dry eye symptoms and disease. In this manuscript, we derive a mathematical model that couples osmolarity (treated as a single solute) and fluid dynamics within the tear film on a 2D eye-shaped domain. The model includes the physical effects of evaporation, surface tension, viscosity, ocular surface wettability, osmolarity, osmosis and tear fluid supply and drainage. The governing system of coupled non-linear partial differential equations is solved using the Overture computational framework, together with a hybrid time-stepping scheme, using a variable step backward differentiation formula and a Runge–Kutta–Chebyshev method that were added to the framework. The results of our numerical simulations provide new insight into the osmolarity distribution over the ocular surface during the interblink. PMID:25883248

  13. Exposure to simultaneous sedentary behavior domains and sociodemographic factors associated in public servants

    Directory of Open Access Journals (Sweden)

    Fernanda Cerveira Fronza

    2017-11-01

    Full Text Available DOI: http://dx.doi.org/10.5007/1980-0037.2017v19n4p469 Exposure to sedentary behavior may contribute to health problems. This study aimed to estimate the prevalence of exposure to simultaneous sedentary behavior domains and to identify associated sociodemographic characteristics among technical and administrative servants of a Brazilian university. This is a cross-sectional epidemiological study carried out with 623 technical and administrative servants. Sedentary behavior was identified through a questionnaire in the following domains: commuting (active/passive), sitting time at work, daily time spent watching television, and computer use (≥3 hours/day). Sociodemographic variables were age, sex and educational level. The prevalence of servants who had one, two, three and four simultaneous sedentary behaviors was 28.4%, 43.2%, 22.5% and 4.3%, respectively. Women were more likely to have three sedentary behaviors simultaneously (OR = 1.61, 95% CI = 1.02-2.56). Servants with 9-11 years of schooling were less exposed to two (OR = 0.27, 95% CI = 0.17-0.44), three (OR = 0.39, 95% CI = 0.23-0.66) and four (OR = 0.22, 95% CI = 0.07-0.69) sedentary behaviors simultaneously, and those with over 12 years of schooling were less likely to have two (OR = 0.22, 95% CI = 0.10-0.49) and three (OR = 0.15, 95% CI = 0.05-0.46) sedentary behaviors simultaneously. More than half of the servants had two sedentary behaviors during the week. Having sedentary behavior in more than one domain simultaneously was associated with sex and educational level.
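
    As a reminder of how the odds ratios (OR) quoted above are formed, a 2x2 exposure-by-outcome table gives OR = (a*d)/(b*c). The counts below are invented for illustration and are not the study's data:

```python
# Computing an odds ratio from a 2x2 table:
#                 outcome present   outcome absent
#   exposed            a                 b
#   unexposed          c                 d

def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# e.g. women vs. men having three simultaneous sedentary behaviors
# (counts invented so the ratio lands near the reported OR):
print(odds_ratio(161, 100, 100, 100))  # 1.61
```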

  14. The Importance of Computer Science for Public Health Training: An Opportunity and Call to Action.

    Science.gov (United States)

    Kunkle, Sarah; Christie, Gillian; Yach, Derek; El-Sayed, Abdulrahman M

    2016-01-01

    A century ago, the Welch-Rose Report established a public health education system in the United States. Since then, the system has evolved to address emerging health needs and integrate new technologies. Today, personalized health technologies generate large amounts of data. Emerging computer science techniques, such as machine learning, present an opportunity to extract insights from these data that could help identify high-risk individuals and tailor health interventions and recommendations. As these technologies play a larger role in health promotion, collaboration between the public health and technology communities will become the norm. Offering public health trainees coursework in computer science alongside traditional public health disciplines will facilitate this evolution, improving public health's capacity to harness these technologies to improve population health.

  15. Computational design of selective peptides to discriminate between similar PDZ domains in an oncogenic pathway.

    Science.gov (United States)

    Zheng, Fan; Jewell, Heather; Fitzpatrick, Jeremy; Zhang, Jian; Mierke, Dale F; Grigoryan, Gevorg

    2015-01-30

    Reagents that target protein-protein interactions to rewire signaling are of great relevance in biological research. Computational protein design may offer a means of creating such reagents on demand, but methods for encoding targeting selectivity are sorely needed. This is especially challenging when targeting interactions with ubiquitous recognition modules, for example PDZ domains, which bind C-terminal sequences of partner proteins. Here we consider the problem of designing selective PDZ inhibitor peptides in the context of an oncogenic signaling pathway, in which two PDZ domains (NHERF-2 PDZ2-N2P2 and MAGI-3 PDZ6-M3P6) compete for a receptor C-terminus to differentially modulate oncogenic activities. Because N2P2 has been shown to increase tumorigenicity and M3P6 to decrease it, we sought to design peptides that inhibit N2P2 without affecting M3P6. We developed a structure-based computational design framework that models peptide flexibility in binding yet is efficient enough to rapidly analyze tradeoffs between affinity and selectivity. Designed peptides showed low-micromolar inhibition constants for N2P2 and no detectable M3P6 binding. Peptides designed for reverse discrimination bound M3P6 more tightly than N2P2, further testing our technology. Experimental and computational analysis of selectivity determinants revealed significant indirect energetic coupling in the binding site. Successful discrimination between N2P2 and M3P6, despite their overlapping binding preferences, is highly encouraging for computational approaches to selective PDZ targeting, especially because the design relied on a homology model of M3P6. Still, we demonstrate specific deficiencies of structural modeling that must be addressed to enable truly robust design. The presented framework is general and can be applied in many scenarios to engineer selective targeting. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Computational design of a PDZ domain peptide inhibitor that rescues CFTR activity.

    Directory of Open Access Journals (Sweden)

    Kyle E Roberts

    Full Text Available The cystic fibrosis transmembrane conductance regulator (CFTR) is an epithelial chloride channel mutated in patients with cystic fibrosis (CF). The most prevalent CFTR mutation, ΔF508, blocks folding in the endoplasmic reticulum. Recent work has shown that some ΔF508-CFTR channel activity can be recovered by pharmaceutical modulators ("potentiators" and "correctors"), but ΔF508-CFTR can still be rapidly degraded via a lysosomal pathway involving the CFTR-associated ligand (CAL), which binds CFTR via a PDZ interaction domain. We present a study that goes from theory, to new structure-based computational design algorithms, to computational predictions, to biochemical testing and ultimately to epithelial-cell validation of novel, effective CAL PDZ inhibitors (called "stabilizers") that rescue ΔF508-CFTR activity. To design the stabilizers, we extended our structural ensemble-based computational protein redesign algorithm K* to encompass protein-protein and protein-peptide interactions. The computational predictions achieved high accuracy: all of the top-predicted peptide inhibitors bound well to CAL. Furthermore, when compared to state-of-the-art CAL inhibitors, our design methodology achieved higher affinity and increased binding efficiency. The designed inhibitor with the highest affinity for CAL (kCAL01) binds six-fold more tightly than the previous best hexamer (iCAL35), and 170-fold more tightly than the CFTR C-terminus. We show that kCAL01 has physiological activity and can rescue chloride efflux in CF patient-derived airway epithelial cells. Since stabilizers address a different cellular CF defect from potentiators and correctors, our inhibitors provide an additional therapeutic pathway that can be used in conjunction with current methods.

  17. Dynamics of domain coverage of the protein sequence universe

    Science.gov (United States)

    2012-01-01

    Background The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its “dark matter”. Results Here we suggest that the true size of “dark matter” is much larger than stated by current definitions. We propose an approach to reducing the size of “dark matter” by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Conclusions Recent improvements in computational domain modeling result in a decrease, albeit a slow one, in the relative size of “dark matter”; however, its absolute size increases substantially with the growth of sequence data. PMID:23157439

  18. Dynamics of domain coverage of the protein sequence universe

    Directory of Open Access Journals (Sweden)

    Rekapalli Bhanu

    2012-11-01

    Full Text Available Abstract Background The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its “dark matter”. Results Here we suggest that the true size of “dark matter” is much larger than stated by current definitions. We propose an approach to reducing the size of “dark matter” by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Conclusions Recent improvements in computational domain modeling result in a decrease, albeit a slow one, in the relative size of “dark matter”; however, its absolute size increases substantially with the growth of sequence data.

  19. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally, the basic prototype system of the biological cloud is achieved. PMID:24078906

  20. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally, the basic prototype system of the biological cloud is achieved.

  1. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web services are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally, the basic prototype system of the biological cloud is achieved.

  2. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  3. EMGAN: A computer program for time and frequency domain reduction of electromyographic data

    Science.gov (United States)

    Hursta, W. N.

    1975-01-01

    An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.
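    The frequency-domain reduction described above can be sketched in a few lines. The following is an illustrative stand-in (not the EMGAN program itself) that computes the mean (centroid) frequency of an interference pattern, the statistic whose downward drift marks muscle fatigue:

    ```python
    import numpy as np

    def mean_frequency(emg, fs):
        """Mean (centroid) frequency of an EMG interference pattern.

        A downward drift of this statistic across successive contractions
        is the classic frequency-domain marker of muscle fatigue.
        """
        emg = np.asarray(emg, dtype=float)
        emg = emg - emg.mean()                    # remove DC offset
        power = np.abs(np.fft.rfft(emg)) ** 2     # one-sided power spectrum
        freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
        return float(np.sum(freqs * power) / np.sum(power))

    # A fatigued muscle shifts spectral power toward lower frequencies:
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    fresh = np.sin(2 * np.pi * 120 * t)           # energy near 120 Hz
    tired = np.sin(2 * np.pi * 60 * t)            # energy near 60 Hz
    assert mean_frequency(fresh, fs) > mean_frequency(tired, fs)
    ```

    The signal morphology itself is never inspected, which is exactly why spectral reduction is preferred for interference patterns.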

  4. DIMA 3.0: Domain Interaction Map.

    Science.gov (United States)

    Luo, Qibin; Pagel, Philipp; Vilne, Baiba; Frishman, Dmitrij

    2011-01-01

    Domain Interaction MAp (DIMA, available at http://webclu.bio.wzw.tum.de/dima) is a database of predicted and known interactions between protein domains. It integrates 5807 structurally known interactions imported from the iPfam and 3did databases and 46,900 domain interactions predicted by four computational methods: domain phylogenetic profiling, the domain pair exclusion algorithm, correlated mutations, and domain interaction prediction in a discriminative way. Additionally, predictions are filtered to exclude those domain pairs that are reported as non-interacting by the Negatome database. The DIMA Web site allows users to calculate domain interaction networks either for a domain of interest or for entire organisms, and to explore them interactively using the Flash-based Cytoscape Web software.

  5. Computational thinking as an emerging competence domain

    NARCIS (Netherlands)

    Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.

    2016-01-01

    Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been

  6. Prion-Like Domains in Phagobiota

    Directory of Open Access Journals (Sweden)

    George Tetz

    2017-11-01

    Full Text Available Prions are molecules characterized by self-propagation, which can undergo a conformational switch leading to the creation of new prions. Prion proteins have originally been associated with the development of mammalian pathologies; however, recently they have been shown to contribute to the environmental adaptation in a variety of prokaryotic and eukaryotic organisms. Bacteriophages are widespread and represent important regulators of microbiota homeostasis and have been shown to be diverse across various bacterial families. Here, we examined whether bacteriophages contain prion-like proteins and whether these prion-like protein domains are involved in the regulation of homeostasis. We used a computational algorithm, prion-like amino acid composition, to detect prion-like domains in 370,617 publicly available bacteriophage protein sequences, which resulted in the identification of 5040 putative prions. We analyzed a set of these prion-like proteins, and observed regularities in their distribution across different phage families, associated with their interactions with the bacterial host cells. We found that prion-like domains could be found across all phages of various groups of bacteria and archaea. The results obtained in this study indicate that bacteriophage prion-like proteins are predominantly involved in the interactions between bacteriophages and bacterial cells, such as those associated with the attachment and penetration of the bacteriophage into the cell, and the release of the phage progeny. These data allow the identification of phage prion-like proteins as novel regulators of the interactions between bacteriophages and bacterial cells.
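    As a minimal sketch of compositional scanning (far simpler than the actual prion-like amino acid composition algorithm, which uses a trained likelihood model), a sliding window can flag Q/N-rich candidate regions, glutamine/asparagine enrichment being the hallmark of many prion-forming domains:

    ```python
    def qn_richness(seq, window=41):
        """Fraction of Gln/Asn residues in the most Q/N-rich window of `seq`.

        Prion-forming domains are typically enriched in glutamine and
        asparagine; a high score flags a candidate region for closer
        scrutiny with a full compositional model. The window length of 41
        is an assumption for illustration, not the published parameter.
        """
        window = min(window, len(seq))
        counts = [1 if aa in "QN" else 0 for aa in seq]
        best = cur = sum(counts[:window])
        for i in range(window, len(seq)):
            cur += counts[i] - counts[i - window]   # slide the window by one
            best = max(best, cur)
        return best / window

    assert qn_richness("QQQQNNNN", window=4) == 1.0
    ```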

  7. Expanding the landscape of chromatin modification (CM)-related functional domains and genes in human.

    Directory of Open Access Journals (Sweden)

    Shuye Pu

    2010-11-01

    Full Text Available Chromatin modification (CM) plays a key role in regulating transcription, DNA replication, repair and recombination. However, our knowledge of these processes in humans remains very limited. Here we use computational approaches to study proteins and functional domains involved in CM in humans. We analyze the abundance and the pair-wise domain-domain co-occurrences of 25 well-documented CM domains in 5 model organisms: yeast, worm, fly, mouse and human. Results show that domains involved in histone methylation, DNA methylation, and histone variants are remarkably expanded in metazoans, reflecting the increased demand for cell type-specific gene regulation. We find that CM domains tend to co-occur with a limited number of partner domains and are hence not promiscuous. This property is exploited to identify 47 potentially novel CM domains, including 24 DNA-binding domains, whose role in CM has received little attention so far. Lastly, we use a consensus Machine Learning approach to predict 379 novel CM genes (coding for 329 proteins in humans based on domain compositions. Several of these predictions are supported by very recent experimental studies and others are slated for experimental verification. Identification of novel CM genes and domains in humans will aid our understanding of fundamental epigenetic processes that are important for stem cell differentiation and cancer biology. Information on all the candidate CM domains and genes reported here is publicly available.

  8. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to only pay a cloud computing provider for what is used. Moreover, as well as financial efficiency, cloud computing is an ecologically-friendly technology, it enables efficient data-sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
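    The G-spot screen itself is a simple pattern match. A minimal sketch (the run-length threshold of four guanines is an assumption for illustration, not the paper's exact criterion):

    ```python
    import re

    def has_g_spot(probe, run_length=4):
        """True if a probe sequence contains a run of >= run_length guanines.

        Probes with such runs are candidates for quadruplex formation on the
        array surface and may therefore report expression unreliably.
        """
        return re.search("G{%d,}" % run_length, probe.upper()) is not None

    # Hypothetical probe sequences, for illustration only:
    probes = ["ATCGGGGTACGTTAGC", "ATCGATCGATCG"]
    flagged = [p for p in probes if has_g_spot(p)]
    ```

    Applied across the millions of probe sequences in public-domain datasets, exactly this kind of embarrassingly parallel filter is what makes the cloud an attractive platform.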

  9. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report describes the design of the architecture and performance measurements of a parallel computing environment for Monte Carlo simulation in particle therapy, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with a single-thread architecture, combined with improved stability. A study of methods for optimizing the system operations also indicated lower cost. (author)
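    A Monte Carlo workload decomposes into independent, separately seeded batches of histories, which is what makes near-linear speedup on cloud HPC instances possible. A minimal sketch (a pi-estimation stand-in, not the particle-therapy dose engine; the `map` call can be handed unchanged to `multiprocessing.Pool.map` or to separate cloud instances):

    ```python
    import random

    def batch_hits(n, seed):
        """Simulate n independent trials with a private RNG stream
        (a stand-in for one worker's batch of particle histories)."""
        rng = random.Random(seed)
        return sum(1 for _ in range(n)
                   if rng.random() ** 2 + rng.random() ** 2 < 1.0)

    def estimate_pi(total=400_000, workers=8):
        """Split the trials across independent seeded batches and combine
        the tallies; each batch could run on its own cloud instance."""
        per = total // workers
        hits = sum(batch_hits(per, seed) for seed in range(workers))
        return 4.0 * hits / (per * workers)
    ```

    Because each batch carries its own seed, the combined estimate is reproducible regardless of how the batches are scheduled.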

  10. DAE emergency response centre (ERC) at Kalpakkam for response to nuclear and radiological emergencies in public domain

    International Nuclear Information System (INIS)

    Meenakshisundaram, V.; Rajagopal, V.; Mathiyarasu, R.; Subramanian, V.; Rajaram, S.; Somayaji, K.M.; Kannan, V.; Rajagopalan, H.

    2008-01-01

    In India, Department of Atomic Energy (DAE) has been identified as the nodal agency/authority in respect of providing the necessary technical inputs in the event of any radiation emergency that may occur in public domain. The overall system takes into consideration statutory requirements, executive decisions as well as National and International obligations. This paper highlights the details about the strength of the Kalpakkam ERC and other essential requisites and their compliance since its formation

  11. Effects of clinically relevant MPL mutations in the transmembrane domain revealed at the atomic level through computational modeling.

    Science.gov (United States)

    Lee, Tai-Sung; Kantarjian, Hagop; Ma, Wanlong; Yeh, Chen-Hsiung; Giles, Francis; Albitar, Maher

    2011-01-01

    Mutations in the thrombopoietin receptor (MPL) may activate relevant pathways and lead to chronic myeloproliferative neoplasms (MPNs). The mechanisms of MPL activation remain elusive because of a lack of experimental structures. Modern computational biology techniques were utilized to explore the mechanisms of MPL protein activation due to various mutations. Transmembrane (TM) domain predictions, homology modeling, ab initio protein structure prediction, and molecular dynamics (MD) simulations were used to build structural dynamic models of wild-type and four clinically observed mutants of MPL. The simulation results suggest that S505 and W515 are important in keeping the TM domain in its correct position within the membrane. Mutations at either of these two positions cause movement of the TM domain, altering the conformation of the nearby intracellular domain in unexpected ways, and may cause the unwanted constitutive activation of MPL's kinase partner, JAK2. Our findings represent the first full-scale molecular dynamics simulations of the wild-type and clinically observed mutants of the MPL protein, a critical element of the MPL-JAK2-STAT signaling pathway. In contrast to usual explanations for the activation mechanism that are based on the relative translational movement between rigid domains of MPL, our results suggest that mutations within the TM region could result in conformational changes including tilt and rotation (azimuthal) angles along the membrane axis. Such changes may significantly alter the conformation of the adjacent and intrinsically flexible intracellular domain. Hence, caution should be exercised when interpreting experimental evidence based on rigid models of cytokine receptors or similar systems.

  12. The Impact of Social Computing on Public Services : a Rationale for Government 2 . 0

    NARCIS (Netherlands)

    Broek, Tijs Van Den; Frissen, Valerie; Huijboom, Noor; Punie, Yves

    2010-01-01

    In this article the impact of the fast emerging social computing trend on the public sector is explored. This exploration is based on the results of a study commissioned by the Institute for Prospective and Technological Studies (IPTS). Three cases of social computing initiatives in diverse

  13. Domain similarity based orthology detection.

    Science.gov (United States)

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich

    2015-05-13

    Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows in a quadratic order. A current challenge of bioinformatic research, especially when taking into account the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationally feasible in a reasonable amount of time. We propose to speed up the detection of orthologous proteins by using strings of domains to characterize the proteins. We present two new protein similarity measures, a cosine and a maximal weight matching score based on domain content similarity, and new software, named porthoDom. The qualities of the cosine and the maximal weight matching similarity measures are compared against curated datasets. The measures show that domain content similarities are able to correctly group proteins into their families. Accordingly, the cosine similarity measure is used inside porthoDom, the wrapper developed for proteinortho. porthoDom makes use of domain content similarity measures to group proteins together before searching for orthologs. By using domains instead of amino acid sequences, the reduction of the search space decreases the computational complexity of an all-against-all sequence comparison. We demonstrate that representing and comparing proteins as strings of discrete domains, i.e. as a concatenation of their unique identifiers, allows a drastic simplification of search space. porthoDom has the advantage of speeding up orthology detection while maintaining a degree of accuracy similar to proteinortho. The implementation of porthoDom is released using python and C++ languages and is available under the GNU GPL licence 3 at http://www.bornberglab.org/pages/porthoda .
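    The domain-content cosine measure can be sketched directly: each protein becomes a multiset of domain identifiers, and similarity is computed over those counts rather than over amino-acid sequences. A minimal illustration of the cosine variant only (the accessions below are examples, and porthoDom's full scoring is richer than this):

    ```python
    from collections import Counter
    from math import sqrt

    def domain_cosine(doms_a, doms_b):
        """Cosine similarity between two proteins represented as multisets
        of domain identifiers (their "strings of domains", order ignored)."""
        a, b = Counter(doms_a), Counter(doms_b)
        dot = sum(a[d] * b[d] for d in a)
        norm = (sqrt(sum(v * v for v in a.values()))
                * sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    # Example Pfam-style accessions, for illustration only:
    kinase = ["PF00069", "PF07714"]
    assert domain_cosine(kinase, kinase) == 1.0
    ```

    Comparing short discrete domain strings instead of full sequences is what shrinks the all-against-all search space.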

  14. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  15. Fractional-Fourier-domain weighted Wigner distribution

    NARCIS (Netherlands)

    Stankovic, L.; Alieva, T.; Bastiaans, M.J.

    2001-01-01

    A fractional-Fourier-domain realization of the weighted Wigner distribution (or S-method), producing auto-terms close to the ones in the Wigner distribution itself, but with reduced cross-terms, is presented. The computational cost of this fractional-domain realization is the same as the

  16. Domain decomposition methods for the neutron diffusion problem

    International Nuclear Information System (INIS)

    Guerin, P.; Baudron, A. M.; Lautard, J. J.

    2010-01-01

    The neutronic simulation of a nuclear reactor core is performed using the neutron transport equation, and leads to an eigenvalue problem in the steady-state case. Among the deterministic resolution methods, simplified transport (SPN) or diffusion approximations are often used. The MINOS solver developed at CEA Saclay uses a mixed dual finite element method for the resolution of these problems, and has shown its efficiency. In order to take into account the heterogeneities of the geometry, a very fine mesh is generally required, which leads to expensive calculations for industrial applications. In order to take advantage of parallel computers, and to reduce the computing time and the local memory requirement, we propose here two domain decomposition methods based on the MINOS solver. The first approach is a component mode synthesis method on overlapping sub-domains: several eigenmode solutions of a local problem on each sub-domain are taken as basis functions used for the resolution of the global problem on the whole domain. The second approach is an iterative method based on a non-overlapping domain decomposition with Robin interface conditions. At each iteration, we solve the problem on each sub-domain with the interface conditions given by the solutions on the adjacent sub-domains estimated at the previous iteration. Numerical results on parallel computers are presented for the diffusion model on realistic 2D and 3D cores. (authors)
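    The iterative flavour of domain decomposition can be illustrated on a toy 1D diffusion problem. The sketch below uses two overlapping subdomains exchanging Dirichlet interface data, a deliberate simplification of the paper's non-overlapping Robin-condition iteration, and none of it reflects the MINOS solver itself:

    ```python
    import numpy as np

    def schwarz_1d(n=50, overlap=6, sweeps=200):
        """Alternating Schwarz iteration for -u'' = 1 on (0, 1), u(0) = u(1) = 0.

        The interval is split into two overlapping subdomains; each sweep
        solves one subdomain exactly, taking its interface value from the
        latest iterate on the other subdomain.
        """
        h = 1.0 / (n + 1)
        u = np.zeros(n + 2)                  # grid values, boundaries included
        mid = n // 2
        left = slice(1, mid + overlap)       # interior nodes of subdomain 1
        right = slice(mid - overlap, n + 1)  # interior nodes of subdomain 2

        def solve(sub):
            m = sub.stop - sub.start
            # Standard second-order finite-difference Laplacian
            A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
            b = np.ones(m)
            b[0] += u[sub.start - 1] / h**2  # boundary / interface data
            b[-1] += u[sub.stop] / h**2
            u[sub] = np.linalg.solve(A, b)

        for _ in range(sweeps):
            solve(left)
            solve(right)
        return u

    u = schwarz_1d()
    x = np.linspace(0.0, 1.0, len(u))
    exact = 0.5 * x * (1.0 - x)              # analytic solution of -u'' = 1
    assert np.max(np.abs(u - exact)) < 1e-3
    ```

    Each subdomain solve involves only its own small matrix, which is the memory and parallelism payoff the abstract describes; the overlap (or, in the paper, the Robin conditions) controls how fast the interface values converge.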

  17. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available of research articles published in SACJ over the journal's first 40 volumes, using the ACM Computing Classification Scheme as a basis. In their analysis the authors divided the publications into three cycles of more or less six years in order to identify...

  18. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  19. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  20. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    Ştefan IOVAN

    2016-05-01

    Full Text Available Cloud computing represents software applications offered as a service online, but also the software and hardware components in the data center. In the case of services widely offered to any type of client, we are dealing with a public cloud. In the other case, in which a cloud is exclusively available to an organization and not open to the public, it is considered a private cloud [1]. There is also a third type, called hybrid, in which a user or an organization might use services available in both the public and the private cloud. One of the main challenges of cloud computing is to build trust and offer information privacy in every aspect of the service offered by cloud computing. The variety of existing standards, just like the lack of clarity in sustainability certification, is no real help in building trust. Question marks also arise regarding the efficiency of the traditional security means that are applied in the cloud domain. Besides the economic and technological advantages offered by the cloud, there are also some advantages in the security area if information is migrated to the cloud. Shared resources available in the cloud include surveillance, the use of "best practices" and technology for an advanced security level, above all the solutions offered by the majority of medium and small businesses, big companies and even some governmental organizations [2].

  1. Domain wall networks on solitons

    International Nuclear Information System (INIS)

    Sutcliffe, Paul

    2003-01-01

    Domain wall networks on the surface of a soliton are studied in a simple theory. It consists of two complex scalar fields, in 3+1 dimensions, with a global U(1)×Z_n symmetry, where n > 2. Solutions are computed numerically in which one of the fields forms a Q ball and the other field forms a network of domain walls localized on the surface of the Q ball. Examples are presented in which the domain walls lie along the edges of a spherical polyhedron, forming junctions at its vertices. It is explained why only a small restricted class of polyhedra can arise as domain wall networks.

  2. New Inversion and Interpretation of Public-Domain Electromagnetic Survey Data from Selected Areas in Alaska

    Science.gov (United States)

    Smith, B. D.; Kass, A.; Saltus, R. W.; Minsley, B. J.; Deszcz-Pan, M.; Bloss, B. R.; Burns, L. E.

    2013-12-01

    Public-domain airborne geophysical surveys (combined electromagnetics and magnetics), mostly collected for and released by the State of Alaska, Division of Geological and Geophysical Surveys (DGGS), are a unique and valuable resource for both geologic interpretation and geophysical methods development. A new joint effort by the US Geological Survey (USGS) and the DGGS aims to add value to these data through the application of novel advanced inversion methods and through innovative and intuitive display of data: maps, profiles, voxel-based models, and displays of estimated inversion quality and confidence. Our goal is to make these data even more valuable for interpretation of geologic frameworks, geotechnical studies, and cryosphere studies, by producing robust estimates of subsurface resistivity that can be used by non-geophysicists. The available datasets, which are available in the public domain, include 39 frequency-domain electromagnetic datasets collected since 1993, and continue to grow with 5 more data releases pending in 2013. The majority of these datasets were flown for mineral resource purposes, with one survey designed for infrastructure analysis. In addition, several USGS datasets are included in this study. The USGS has recently developed new inversion methodologies for airborne EM data and have begun to apply these and other new techniques to the available datasets. These include a trans-dimensional Markov Chain Monte Carlo technique, laterally-constrained regularized inversions, and deterministic inversions which include calibration factors as a free parameter. Incorporation of the magnetic data as an additional constraining dataset has also improved the inversion results. Processing has been completed in several areas, including Fortymile and the Alaska Highway surveys, and continues in others such as the Styx River and Nome surveys. Utilizing these new techniques, we provide models beyond the apparent resistivity maps supplied by the original

  3. Time-Domain Terahertz Computed Axial Tomography NDE System

    Science.gov (United States)

    Zimdars, David

    2012-01-01

    NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320 ps, 100-Hz waveform acquisition system with an 80 ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples. The motion control gantry allows inspection of sample sizes of up to approximately one cubic foot (approximately 0.03 cubic meters). The system reduced to practice a CT TD-THz system incorporating a COTS 80-ps/1-kHz waveform scanner. The incorporation of this scanner in the system allows acquisition of 3D

  4. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
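The feedback loop the tool implements can be sketched in a few lines: as the user adjusts generation shares, recompute the portfolio's projected emissions and cost and check a low-carbon constraint. Everything below is a hypothetical sketch; the technology names, intensities, costs, and the 0.10 kg/kWh cap are invented placeholders, not the tool's actual data or interface.

```python
# (CO2 intensity in kg per kWh, cost in cents per kWh) -- illustrative values
TECH = {
    "wind":        (0.01, 8.0),
    "nuclear":     (0.01, 10.0),
    "coal_ccs":    (0.15, 12.0),
    "natural_gas": (0.40, 7.0),
    "efficiency":  (0.00, 5.0),
}

def portfolio_metrics(shares):
    """Return (kg CO2/kWh, cents/kWh) for a dict of generation shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    co2 = sum(frac * TECH[t][0] for t, frac in shares.items())
    cost = sum(frac * TECH[t][1] for t, frac in shares.items())
    return co2, cost

def is_low_carbon(shares, cap=0.10):
    """Constraint analogous to the tool's: portfolio CO2 below a cap."""
    return portfolio_metrics(shares)[0] <= cap

shares = {"wind": 0.3, "nuclear": 0.2, "coal_ccs": 0.2,
          "natural_gas": 0.1, "efficiency": 0.2}
co2, cost = portfolio_metrics(shares)
```

As in the tool, each change to `shares` immediately yields updated emissions and cost figures for the user to weigh.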

  5. Sources and Resources Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health.

    Science.gov (United States)

    Gorsky, Martin

    2015-08-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996-2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the 'declinist' historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour.

  6. Three-dimensional transient electromagnetic modeling in the Laplace Domain

    International Nuclear Information System (INIS)

    Mizunaga, H.; Lee, Ki Ha; Kim, H.J.

    1998-01-01

    In modeling electromagnetic responses, Maxwell's equations in the frequency domain are popular and have been widely used (Nabighian, 1994; Newman and Alumbaugh, 1995; Smith, 1996, to list a few). Recently, electromagnetic modeling in the time domain using the finite difference (FDTD) method (Wang and Hohmann, 1993) has also been used to study transient electromagnetic interactions in the conductive medium. This paper presents a new technique to compute the electromagnetic response of three-dimensional (3-D) structures. The proposed new method is based on transforming Maxwell's equations to the Laplace domain. For each discrete Laplace variable, Maxwell's equations are discretized in 3-D using the staggered grid and the finite difference method (FDM). The resulting system of equations is then solved for the fields using the incomplete Cholesky conjugate gradient (ICCG) method. The new method is particularly effective in saving computer memory since all the operations are carried out in real numbers. For the same reason, the computing speed is faster than frequency domain modeling. The proposed approach can be an extremely useful tool in developing an inversion algorithm using the time domain data
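The core numerical step described above, solving a real system for each discrete Laplace variable, can be sketched in one dimension: the diffusion form of Maxwell's equations gives, for each real s, a real symmetric positive-definite system, which is why the method works entirely in real arithmetic. SciPy ships no incomplete Cholesky factorization, so the sketch below substitutes an incomplete LU preconditioner for the paper's ICCG; the grid size, spacing, and material values are illustrative.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, h = 200, 0.05          # grid points, spacing (illustrative)
mu_sigma = 1.0            # mu*sigma of the conductive medium
s = 10.0                  # one discrete (real) Laplace variable

# (s*mu*sigma*I - Laplacian) E = source  -- real SPD for s > 0
lap = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
A = (s * mu_sigma * sp.identity(n) + lap).tocsc()

rhs = np.zeros(n)
rhs[n // 2] = 1.0         # point source in the middle of the domain

ilu = spla.spilu(A)       # incomplete factorization as preconditioner
M = spla.LinearOperator((n, n), ilu.solve)
E, info = spla.cg(A, rhs, M=M)   # preconditioned conjugate gradient
```

In the full 3-D method the same solve is repeated for each Laplace variable on the staggered grid; the memory saving comes precisely from `A`, `rhs`, and `E` all being real.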

  7. Parallel finite elements with domain decomposition and its pre-processing

    International Nuclear Information System (INIS)

    Yoshida, A.; Yagawa, G.; Hamada, S.

    1993-01-01

    This paper describes a parallel finite element analysis using a domain decomposition method, and the pre-processing for the parallel calculation. Computer simulations are about to replace experiments in various fields, and the scale of the models to be simulated tends to be extremely large. On the other hand, the computational environment has changed drastically in recent years. In particular, parallel processing on massively parallel computers or computer networks is considered a promising technique. In order to achieve high efficiency in such a parallel computation environment, large granularity of tasks and a well-balanced workload distribution are key issues. It is also important to reduce the cost of pre-processing in such parallel FEM. From this point of view, the authors developed a domain decomposition FEM with an automatic and dynamic task-allocation mechanism and an automatic mesh generation/domain subdivision system for it. (author)

  8. Predicting detection performance with model observers: Fourier domain or spatial domain?

    Science.gov (United States)

    Chen, Baiyu; Yu, Lifeng; Leng, Shuai; Kofler, James; Favazza, Christopher; Vrieze, Thomas; McCollough, Cynthia

    2016-02-27

    The use of the Fourier domain model observer is challenged by iterative reconstruction (IR), because IR algorithms are nonlinear and IR images have noise texture different from that of FBP. A modified Fourier domain model observer, which incorporates nonlinear noise and resolution properties, has been proposed for IR and needs to be validated with human detection performance. On the other hand, the spatial domain model observer is theoretically applicable to IR, but more computationally intensive than the Fourier domain method. The purpose of this study is to compare the modified Fourier domain model observer to the spatial domain model observer with both FBP and IR images, using human detection performance as the gold standard. A phantom with inserts of various low contrast levels and sizes was repeatedly scanned 100 times on a third-generation, dual-source CT scanner at 5 dose levels and reconstructed using FBP and IR algorithms. The human detection performance of the inserts was measured via a 2-alternative-forced-choice (2AFC) test. In addition, two model observer performances were calculated, including a Fourier domain non-prewhitening model observer and a spatial domain channelized Hotelling observer. The performance of these two model observers was compared in terms of how well they correlated with human observer performance. Our results demonstrated that the spatial domain model observer correlated well with human observers across various dose levels, object contrast levels, and object sizes. The Fourier domain observer correlated well with human observers using FBP images, but overestimated the detection performance using IR images.
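A toy version of the spatial domain channelized Hotelling observer (CHO) mentioned above, run on synthetic white-noise images rather than the phantom scans; the difference-of-Gaussians channels and all parameters are illustrative assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32
y, x = np.mgrid[:N, :N] - N // 2
r2 = x**2 + y**2
signal = 0.8 * np.exp(-r2 / (2 * 3.0**2))        # low-contrast insert

# difference-of-Gaussians channel profiles at a few radial scales
channels = []
for s in [1.0, 2.0, 4.0, 8.0]:
    dog = np.exp(-r2 / (2 * s**2)) - np.exp(-r2 / (2 * (2 * s)**2))
    channels.append((dog / np.linalg.norm(dog)).ravel())
U = np.array(channels)                            # (n_channels, N*N)

def channel_outputs(images):
    """Project each image onto the channels."""
    return images.reshape(len(images), -1) @ U.T  # (n_images, n_channels)

n_train = 500
noise = lambda: rng.normal(0.0, 1.0, (n_train, N, N))
v_absent = channel_outputs(noise())
v_present = channel_outputs(noise() + signal)

# Hotelling detectability index in channel space
S = 0.5 * (np.cov(v_absent.T) + np.cov(v_present.T))
dv = v_present.mean(0) - v_absent.mean(0)
d_prime = float(np.sqrt(dv @ np.linalg.solve(S, dv)))
```

The channelization is what keeps the spatial domain observer tractable: the covariance inversion happens in the small channel space, not over all pixels.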

  9. Online self-report questionnaire on computer work-related exposure (OSCWE): validity and internal consistency.

    Science.gov (United States)

    Mekhora, Keerin; Jalayondeja, Wattana; Jalayondeja, Chutima; Bhuanantanondh, Petcharatana; Dusadiisariyavong, Asadang; Upiriyasakul, Rujiret; Anuraktam, Khajornyod

    2014-07-01

    To develop an online, self-report questionnaire on computer work-related exposure (OSCWE) and to determine the internal consistency, face and content validity of the questionnaire. The online, self-report questionnaire was developed to determine the risk factors related to musculoskeletal disorders in computer users. It comprised five domains: personal, work-related, work environment, physical health and psychosocial factors. The questionnaire's content was validated by an occupational medical doctor and three physical therapy lecturers involved in ergonomic teaching. Twenty-five lay people examined the feasibility of computer administration and the user-friendliness of the language. The item correlation in each domain was analyzed for internal consistency (Cronbach's alpha; alpha). The content of the questionnaire was considered congruent with the testing purposes. Eight hundred and thirty-five computer users at the PTT Exploration and Production Public Company Limited responded to the online self-report questionnaire. The internal consistency of the five domains was: personal (alpha = 0.58), work-related (alpha = 0.348), work environment (alpha = 0.72), physical health (alpha = 0.68) and psychosocial factors (alpha = 0.93). The findings suggested that the OSCWE had acceptable internal consistency for work environment and psychosocial factors. The OSCWE is available for use in population-based survey research among computer office workers.

  10. Finding the Secret of Image Saliency in the Frequency Domain.

    Science.gov (United States)

    Li, Jia; Duan, Ling-Yu; Chen, Xiaowu; Huang, Tiejun; Tian, Yonghong

    2015-12-01

    There are two sides to every story of visual saliency modeling in the frequency domain. On the one hand, image saliency can be effectively estimated by applying simple operations to the frequency spectrum. On the other hand, it is still unclear which part of the frequency spectrum contributes the most to popping-out targets and suppressing distractors. Toward this end, this paper tentatively explores the secret of image saliency in the frequency domain. From the results obtained in several qualitative and quantitative experiments, we find that the secret of visual saliency may mainly hide in the phases of intermediate frequencies. To explain this finding, we reinterpret the concept of discrete Fourier transform from the perspective of template-based contrast computation and thus develop several principles for designing the saliency detector in the frequency domain. Following these principles, we propose a novel approach to design the saliency detector under the assistance of prior knowledge obtained through both unsupervised and supervised learning processes. Experimental results on a public image benchmark show that the learned saliency detector outperforms 18 state-of-the-art approaches in predicting human fixations.
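The "phases of intermediate frequencies" finding can be illustrated directly: discard the amplitude spectrum, band-pass the intermediate radial frequencies, invert, and smooth. The band limits, blur width, and test image below are illustrative choices, not the learned detector proposed in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

N = 64
img = np.zeros((N, N))
img[28:36, 28:36] = 1.0                    # a small pop-out target

F = np.fft.fft2(img)
phase_only = F / (np.abs(F) + 1e-12)       # discard amplitude, keep phase

# band-pass mask over intermediate radial frequencies (illustrative band)
fy = np.fft.fftfreq(N)[:, None]
fx = np.fft.fftfreq(N)[None, :]
r = np.sqrt(fx**2 + fy**2)
band = (r > 0.05) & (r < 0.25)

sal = np.abs(np.fft.ifft2(phase_only * band)) ** 2
sal = gaussian_filter(sal, sigma=2.0)      # smooth into a saliency map

inside = sal[24:40, 24:40].mean()          # energy around the target
outside = sal.mean()                       # average over the whole map
```

The saliency energy concentrates around the target even though all amplitude information was thrown away, which is the paper's template-based-contrast reading of the Fourier phase.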

  11. Computational domain length and Reynolds number effects on large-scale coherent motions in turbulent pipe flow

    Science.gov (United States)

    Feldmann, Daniel; Bauer, Christian; Wagner, Claus

    2018-03-01

    We present results from direct numerical simulations (DNS) of turbulent pipe flow at shear Reynolds numbers up to Reτ = 1500 using different computational domains with lengths up to ?. The objectives are to analyse the effect of the finite size of the periodic pipe domain on large flow structures in dependence on Reτ and to assess the minimum ? required for relevant turbulent scales to be captured and the minimum Reτ for very large-scale motions (VLSM) to be analysed. Analysing one-point statistics revealed that the mean velocity profile is invariant for ?. The wall-normal location at which deviations occur in shorter domains changes strongly with increasing Reτ from the near-wall region to the outer layer, where VLSM are believed to live. The root mean square velocity profiles exhibit domain-length dependencies for pipes shorter than 14R and 7R, depending on Reτ. For all Reτ, the higher-order statistical moments show only weak dependencies, and only for the shortest domain considered here. However, the analysis of one- and two-dimensional pre-multiplied energy spectra revealed that even for larger ?, not all physically relevant scales are fully captured, even though the aforementioned statistics are in good agreement with the literature. We found ? to be sufficiently large to capture VLSM-relevant turbulent scales in the considered range of Reτ, based on our definition of an integral energy threshold of 10%. The requirement to capture at least 1/10 of the global maximum energy level is justified by a 14% increase of the streamwise turbulence intensity in the outer region between Reτ = 720 and 1500, which can be related to VLSM-relevant length scales. Based on this scaling anomaly, we found Reτ⪆1500 to be a necessary minimum requirement to investigate VLSM-related effects in pipe flow, even though the streamwise energy spectra do not yet indicate sufficient scale separation between the most energetic and the very long motions.
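The pre-multiplied energy spectra used as the diagnostic above are straightforward to compute: a periodic domain can only represent wavelengths up to its own length, so energy in very long motions appears in k·E(k) only if the box is long enough. A toy periodic "streamwise velocity" with one short and one VLSM-like long mode illustrates the computation; all wavenumbers and amplitudes are made up.

```python
import numpy as np

L = 32.0                       # domain length (in pipe radii, say)
n = 2048
x = np.linspace(0.0, L, n, endpoint=False)
# short-wavelength mode (index 8) plus a very long, box-filling mode (index 1)
u = 1.0 * np.sin(2 * np.pi * 8 * x / L) + 0.5 * np.sin(2 * np.pi * 1 * x / L)

uh = np.fft.rfft(u) / n
E = 2.0 * np.abs(uh)**2        # one-sided energy spectrum E(k)
k = 2 * np.pi * np.arange(len(E)) / L
kE = k * E                     # pre-multiplied spectrum k*E(k)

peaks = np.argsort(kE)[-2:]    # the two imposed modes dominate
```

Shrinking `L` below the long mode's wavelength removes that mode from the representable wavenumbers entirely, which is the domain-length effect the study quantifies.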

  12. Coping with Computer Viruses: General Discussion and Review of Symantec Anti-Virus for the Macintosh.

    Science.gov (United States)

    Primich, Tracy

    1992-01-01

    Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM is recommended for use in library settings as well as two public domain virus protection programs. (four references) (MES)

  13. Self-consistent field theory simulations of polymers on arbitrary domains

    Energy Technology Data Exchange (ETDEWEB)

    Ouaknin, Gaddiel, E-mail: gaddielouaknin@umail.ucsb.edu [Department of Mechanical Engineering, University of California, Santa Barbara, CA 93106-5070 (United States); Laachi, Nabil; Delaney, Kris [Materials Research Laboratory, University of California, Santa Barbara, CA 93106-5080 (United States); Fredrickson, Glenn H. [Materials Research Laboratory, University of California, Santa Barbara, CA 93106-5080 (United States); Department of Chemical Engineering, University of California, Santa Barbara, CA 93106-5080 (United States); Department of Materials, University of California, Santa Barbara, CA 93106-5050 (United States); Gibou, Frederic [Department of Mechanical Engineering, University of California, Santa Barbara, CA 93106-5070 (United States); Department of Computer Science, University of California, Santa Barbara, CA 93106-5110 (United States)

    2016-12-15

    We introduce a framework for simulating the mesoscale self-assembly of block copolymers in arbitrary confined geometries subject to Neumann boundary conditions. We employ a hybrid finite difference/volume approach to discretize the mean-field equations on an irregular domain represented implicitly by a level-set function. The numerical treatment of the Neumann boundary conditions is sharp, i.e. it avoids an artificial smearing in the irregular domain boundary. This strategy enables the study of self-assembly in confined domains and enables the computation of physically meaningful quantities at the domain interface. In addition, we employ adaptive grids encoded with Quad-/Oc-trees in parallel to automatically refine the grid where the statistical fields vary rapidly as well as at the boundary of the confined domain. This approach results in a significant reduction in the number of degrees of freedom and makes the simulations in arbitrary domains using effective boundary conditions computationally efficient in terms of both speed and memory requirement. Finally, in the case of regular periodic domains, where pseudo-spectral approaches are superior to finite differences in terms of CPU time and accuracy, we use the adaptive strategy to store chain propagators, reducing the memory footprint without loss of accuracy in computed physical observables.

  14. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
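The computation the article describes reduces to summing direct staff time at each staff level's loaded hourly rate with prorated space, utility, and publicity costs. A minimal sketch with invented placeholder figures (not the article's numbers):

```python
def distribution_cost(staff_hours, space_sqft, cost_per_sqft,
                      utility_share, publicity):
    """Total dollar cost of one tax-form season.

    staff_hours -- {level: (hours, hourly_rate_with_benefits)}
    """
    staff = sum(h * rate for h, rate in staff_hours.values())
    return staff + space_sqft * cost_per_sqft + utility_share + publicity

cost = distribution_cost(
    staff_hours={"librarian": (40, 28.0),   # hypothetical hours and rates
                 "clerk": (120, 14.0),
                 "page": (60, 9.0)},
    space_sqft=80, cost_per_sqft=1.25,      # display area, prorated
    utility_share=150.0, publicity=200.0,
)
```

The point of the method is that the total is dominated by indirect costs (staff time at loaded rates) that a simple count of forms handed out would miss.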

  15. Astrocyte mega-domain hypothesis of the autistic savantism.

    Science.gov (United States)

    Mitterauer, Bernhard J

    2013-01-01

    Individuals with autism who show high abilities are called savants. Whereas a disconnection in and between neural networks has been identified in their brains, savantism is yet poorly understood. Focusing on astrocyte domain organization, it is hypothesized that local astrocyte mega-organizations may be responsible for exerting high capabilities in brains of autistic savants. Astrocytes, the dominant glial cell type, modulate synaptic information transmission. Each astrocyte is organized in non-overlapping domains. Formally, each astrocyte contacting n-neurons with m-synapses via its processes generates dynamic domains of synaptic interactions based on qualitative computation criteria, and hereby it structures neuronal information processing. If the number of processes is genetically significantly increased, these astrocytes operate in a mega-domain with a higher complexity of computation. From this model savant abilities are deduced. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Polyhedral meshing as an innovative approach to computational domain discretization of a cyclone in a fluidized bed CLC unit

    Directory of Open Access Journals (Sweden)

    Sosnowski Marcin

    2017-01-01

    Full Text Available Chemical Looping Combustion (CLC) is a technology that allows the separation of CO2, which is generated by the combustion of fossil fuels. The majority of process designs currently under investigation are systems of coupled fluidized beds. Advances in the development of power generation systems using CLC cannot be introduced without using numerical modelling as a research tool. The primary and critical activity in numerical modelling is the computational domain discretization. It influences the numerical diffusion as well as the convergence of the model and therefore the overall accuracy of the obtained results. Hence an innovative approach to computational domain discretization using a polyhedral (POLY) mesh is proposed in the paper. This method reduces both the numerical diffusion of the mesh and the time cost of preparing the model for subsequent calculation. The major advantage of a POLY mesh is that each individual cell has many neighbours, so gradients can be much better approximated in comparison to the commonly-used tetrahedral (TET) mesh. POLYs are also less sensitive to stretching than TETs, which results in better numerical stability of the model. Therefore a detailed comparison of numerical modelling results concerning a subsection of a CLC system using tetrahedral and polyhedral meshes is covered in the paper.

  17. The effect of finite-difference time-domain resolution and power-loss computation method on SAR values in plane-wave exposure of Zubal phantom

    International Nuclear Information System (INIS)

    Uusitupa, T M; Ilvonen, S A; Laakso, I M; Nikoskinen, K I

    2008-01-01

    In this paper, the anatomically realistic body model Zubal is exposed to a plane wave. A finite-difference time-domain (FDTD) method is used to obtain field data for specific-absorption-rate (SAR) computation. It is investigated how the FDTD resolution, power-loss computation method and positioning of the material voxels in the FDTD grid affect the SAR results. The results enable one to estimate the effects due to certain fundamental choices made in the SAR simulation
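The abstract's point that the power-loss computation method matters can be made concrete: SAR in a voxel is σ|E|²/(2ρ) for peak-amplitude fields, and |E|² differs depending on whether the staggered Yee-grid edge components are averaged to the voxel centre before squaring or squared edge-by-edge. The field and tissue values below are illustrative, not taken from the Zubal simulations.

```python
import numpy as np

sigma, rho = 0.87, 1040.0   # tissue conductivity (S/m), density (kg/m^3)

# peak-amplitude E-field components on the 12 edges of one Yee voxel,
# grouped per Cartesian direction (4 parallel edges each) -- made-up values
Ex = np.array([10.0, 11.0, 9.5, 10.5])
Ey = np.array([2.0, 2.2, 1.8, 2.0])
Ez = np.array([5.0, 4.8, 5.2, 5.0])

def sar_centre_avg(Ex, Ey, Ez):
    """Average the 4 parallel edges to the voxel centre, then square."""
    e2 = Ex.mean()**2 + Ey.mean()**2 + Ez.mean()**2
    return sigma * e2 / (2.0 * rho)

def sar_edge_mean(Ex, Ey, Ez):
    """Square each edge first, then average (a different power-loss rule)."""
    e2 = (Ex**2).mean() + (Ey**2).mean() + (Ez**2).mean()
    return sigma * e2 / (2.0 * rho)

s1 = sar_centre_avg(Ex, Ey, Ez)
s2 = sar_edge_mean(Ex, Ey, Ez)
```

By Jensen's inequality `s2 >= s1` always, so the choice of rule systematically biases the voxel SAR, which is one of the effects the paper quantifies alongside grid resolution and voxel positioning.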

  18. Leveraging Cloud Computing to Address Public Health Disparities: An Analysis of the SPHPS.

    Science.gov (United States)

    Jalali, Arash; Olabode, Olusegun A; Bell, Christopher M

    2012-01-01

    As the use of certified electronic health record technology (CEHRT) has continued to gain prominence in hospitals and physician practices, public health agencies and health professionals have the ability to access health data through health information exchanges (HIE). With such knowledge health providers are well positioned to positively affect population health, and enhance health status or quality-of-life outcomes in at-risk populations. Through big data analytics, predictive analytics and cloud computing, public health agencies have the opportunity to observe emerging public health threats in real-time and provide more effective interventions addressing health disparities in our communities. The Smarter Public Health Prevention System (SPHPS) provides real-time reporting of potential public health threats to public health leaders through the use of a simple and efficient dashboard and links people with needed personal health services through mobile platforms for smartphones and tablets to promote and encourage healthy behaviors in our communities. The purpose of this working paper is to evaluate how a secure virtual private cloud (VPC) solution could facilitate the implementation of the SPHPS in order to address public health disparities.

  19. Shapes of isolated domains and field induced evolution of regular and random 2D domain structures in LiNbO3 and LiTaO3

    International Nuclear Information System (INIS)

    Chernykh, A.; Shur, V.; Nikolaeva, E.; Shishkin, E.; Shur, A.; Terabe, K.; Kurimura, S.; Kitamura, K.; Gallo, K.

    2005-01-01

    The variety of the shapes of isolated domains, revealed in congruent and stoichiometric LiTaO3 and LiNbO3 by chemical etching and visualized by optical and scanning probe microscopy, was obtained by computer simulation. The kinetic nature of the domain shape was clearly demonstrated. The kinetics of domain structure with the dominance of the growth of the steps formed at the domain walls as a result of domain merging was investigated experimentally in a slightly distorted artificial regular two-dimensional (2D) hexagonal domain structure and a random natural one. The artificial structure has been realized in congruent LiNbO3 by a 2D electrode pattern produced by photolithography. The polarization reversal in congruent LiTaO3 was investigated as an example of natural domain growth limited by merging. The switching process defined by domain merging was studied by computer simulation. The crucial dependence of the switching kinetics on the nuclei concentration has been revealed

  20. 6th International Workshop on Computer-Aided Scheduling of Public Transport

    CERN Document Server

    Branco, Isabel; Paixão, José

    1995-01-01

    This proceedings volume consists of papers presented at the Sixth International Workshop on Computer-Aided Scheduling of Public Transport, which was held at the Fundação Calouste Gulbenkian in Lisbon from July 6th to 9th, 1993. In the tradition of alternating Workshops between North America and Europe - Chicago (1975), Leeds (1980), Montreal (1983), Hamburg (1987) and again Montreal (1990), the European city of Lisbon was selected as the venue for the Workshop in 1993. As in earlier Workshops, the central theme dealt with vehicle and duty scheduling problems and the employment of operations-research-based software systems for operational planning in public transport. However, as was initiated in Hamburg in 1987, the scope of this Workshop was broadened to include topics in related fields. This fundamental alteration was an inevitable consequence of the growing demand over the last decade for solutions to the complete planning process in public transport through integrated systems. Therefore, the program of thi...

  1. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    Directory of Open Access Journals (Sweden)

    Marco Schmitt

    Full Text Available Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications with some publications clearly related to community-specific interests of computer scientists, while others with a strong relation to attention mechanisms in social media. This refers to the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall the computer scientists' style of usage seems to be more on the information-oriented side and to some degree also on professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.

  2. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    Science.gov (United States)

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

    Twitter communication has permeated every sphere of society. To highlight and share small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted and the differences between computer scientists and a Twitter sample enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources and especially in linked publications with some publications clearly related to community-specific interests of computer scientists, while others with a strong relation to attention mechanisms in social media. This refers to the observation that Twitter is a hybrid form of social media between an information service and a social network service. Overall the computer scientists' style of usage seems to be more on the information-oriented side and to some degree also on professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.

  3. Associations of Total and Domain-Specific Sedentary Time With Type 2 Diabetes in Taiwanese Older Adults

    Directory of Open Access Journals (Sweden)

    Ming-Chun Hsueh

    2016-07-01

    Full Text Available Background: The increasing prevalence of type 2 diabetes in older adults has become a public health concern. We investigated the associations of total and domain-specific sedentary time with risk of type 2 diabetes in older adults. Methods: The sample comprised 1046 older people (aged ≥65 years). Analyses were performed using cross-sectional data collected via computer-assisted telephone-based interviews in 2014. Data on six self-reported domains of sedentary time (Measure of Older Adults’ Sedentary Time), type 2 diabetes status, and sociodemographic variables were included in the study. Binary logistic regression analysis was performed to calculate the adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for total and individual sedentary behavior components and likelihood of type 2 diabetes. Results: A total of 17.5% of the participants reported type 2 diabetes. No significant associations were found between total sitting time and risk of type 2 diabetes, after controlling for confounding factors. After total sedentary behavior was stratified into six domains, only watching television for more than 2 hours per day was associated with higher odds of type 2 diabetes (OR 1.56; 95% CI, 1.10–2.21), but no significant associations were found between other domains of sedentary behavior (computer use, reading, socializing, transport, and hobbies) and risk of type 2 diabetes. Conclusions: These findings suggest that, among domain-specific sedentary behavior, excessive television viewing might increase the risk of type 2 diabetes among older adults more than other forms of sedentary behavior.
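The adjusted odds ratios above come from logistic regression; the unadjusted version of the same quantity can be read directly off a 2×2 table, with a Wald confidence interval on the log odds ratio. The counts below are made up for illustration, not the study's data.

```python
import math

# rows: TV >= 2 h/day vs < 2 h/day; columns: diabetes yes / no
a, b = 90, 360    # high TV:  cases, non-cases   (hypothetical counts)
c, d = 60, 490    # low TV:   cases, non-cases   (hypothetical counts)

odds_ratio = (a * d) / (b * c)            # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An interval whose lower bound stays above 1.0, as in the study's OR 1.56 (95% CI 1.10–2.21), is what "associated with higher odds" means here; the regression version additionally adjusts for the sociodemographic confounders.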

  4. Knowing, Applying, and Reasoning about Arithmetic: Roles of Domain-General and Numerical Skills in Multiple Domains of Arithmetic Learning

    Science.gov (United States)

    Zhang, Xiao; Räsänen, Pekka; Koponen, Tuire; Aunola, Kaisa; Lerkkanen, Marja-Kristiina; Nurmi, Jari-Erik

    2017-01-01

    The longitudinal relations of domain-general and numerical skills at ages 6-7 years to 3 cognitive domains of arithmetic learning, namely knowing (written computation), applying (arithmetic word problems), and reasoning (arithmetic reasoning) at age 11, were examined for a representative sample of 378 Finnish children. The results showed that…

  5. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    Full Text Available This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund, the article discusses the experience of digitally scripted spaces and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  6. Casimir forces in the time domain: Theory

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro W.; McCauley, Alexander P.; Joannopoulos, John D.; Johnson, Steven G.

    2009-01-01

    We present a method to compute Casimir forces in arbitrary geometries and for arbitrary materials based on the finite-difference time-domain (FDTD) scheme. The method involves the time evolution of electric and magnetic fields in response to a set of current sources, in a modified medium with frequency-independent conductivity. The advantage of this approach is that it allows one to exploit existing FDTD software, without modification, to compute Casimir forces. In this paper, we focus on the derivation, implementation choices, and essential properties of the time-domain algorithm, both considered analytically and illustrated in the simplest parallel-plate geometry.

  7. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications, including the domains of medicine, bioinformatics, electronics, communications and business. There has been progress in applying computational intelligence in the nuclear reactor domain during the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges for the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  8. A hybrid time-domain discontinuous galerkin-boundary integral method for electromagnetic scattering analysis

    KAUST Repository

    Li, Ping; Shi, Yifei; Jiang, Lijun; Bagci, Hakan

    2014-01-01

    A scheme hybridizing discontinuous Galerkin time-domain (DGTD) and time-domain boundary integral (TDBI) methods for accurately analyzing transient electromagnetic scattering is proposed. Radiation condition is enforced using the numerical flux on the truncation boundary. The fields required by the flux are computed using the TDBI from equivalent currents introduced on a Huygens' surface enclosing the scatterer. The hybrid DGTDBI ensures that the radiation condition is mathematically exact and the resulting computation domain is as small as possible since the truncation boundary conforms to scatterer's shape and is located very close to its surface. Locally truncated domains can also be defined around each disconnected scatterer additionally reducing the size of the overall computation domain. Numerical examples demonstrating the accuracy and versatility of the proposed method are presented. © 2014 IEEE.

  9. A hybrid time-domain discontinuous galerkin-boundary integral method for electromagnetic scattering analysis

    KAUST Repository

    Li, Ping

    2014-05-01

    A scheme hybridizing discontinuous Galerkin time-domain (DGTD) and time-domain boundary integral (TDBI) methods for accurately analyzing transient electromagnetic scattering is proposed. Radiation condition is enforced using the numerical flux on the truncation boundary. The fields required by the flux are computed using the TDBI from equivalent currents introduced on a Huygens' surface enclosing the scatterer. The hybrid DGTDBI ensures that the radiation condition is mathematically exact and the resulting computation domain is as small as possible since the truncation boundary conforms to scatterer's shape and is located very close to its surface. Locally truncated domains can also be defined around each disconnected scatterer additionally reducing the size of the overall computation domain. Numerical examples demonstrating the accuracy and versatility of the proposed method are presented. © 2014 IEEE.

  10. DATA TRANSFER FROM A DEC PDP-11 BASED MASS-SPECTROMETRY DATA STATION TO AN MS-DOS PERSONAL-COMPUTER

    NARCIS (Netherlands)

    RAFFAELLI, A; BRUINS, AP

    This paper describes a simple procedure for obtaining better-quality graphic output of mass spectrometry data from data systems equipped with poor-quality printing devices. The procedure uses KERMIT, low-cost public-domain software, to transfer ASCII tables to an MS-DOS personal computer where

  11. Finite difference time domain analysis of a chiro plasma

    International Nuclear Information System (INIS)

    Torres-Silva, H.; Obligado, A.; Reggiani, N.; Sakanaka, P.H.

    1995-01-01

    The finite difference time-domain (FDTD) method is one of the most widely used computational methods in electromagnetics. Using FDTD, Maxwell's equations are solved directly in the time domain via finite differences and time stepping. The basic approach is relatively easy to understand and is an alternative to the more usual frequency-domain approaches. (author). 5 refs

  12. RISC Processors and High Performance Computing

    Science.gov (United States)

    Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.

  13. An Overview of Public Domain Tools for Measuring the Sustainability of Environmental Remediation - 12060

    Energy Technology Data Exchange (ETDEWEB)

    Claypool, John E.; Rogers, Scott [AECOM, Denver, Colorado, 80202 (United States)

    2012-07-01

    their clients. When it comes to the public domain, Federal government agencies are spearheading the development of software tools to measure and report emissions of air pollutants (e.g., carbon dioxide, other greenhouse gases, criteria air pollutants); consumption of energy, water and natural resources; accident and safety risks; project costs and other economic metrics. Most of the tools developed for the Government are available to environmental practitioners without charge, so they are growing in usage and popularity. The key features and metrics calculated by the available public-domain tools for measuring the sustainability of environmental remediation projects share some commonalities, but there are differences amongst the tools. The SiteWise{sup TM} sustainability tool developed for the Navy and US Army will be compared with the Sustainable Remediation Tool (SRT{sup TM}) developed for the US Air Force (USAF). In addition, the USAF's Clean Solar and Wind Energy in Environmental Programs (CleanSWEEP), a soon-to-be-released tool for evaluating the economic feasibility of utilizing renewable energy to power remediation systems, will be described in the paper. (authors)

  14. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation.

    Science.gov (United States)

    Mourad, Raphaël; Cuvier, Olivier

    2016-05-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment test and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1.
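The regression idea can be illustrated on synthetic data. The NumPy sketch below (not the authors' code) fits a multiple logistic regression by gradient ascent to simulated border/non-border loci with three made-up binary features; the signs of the fitted coefficients recover which features act as positive or negative drivers of borders.

```python
import numpy as np

# Illustrative sketch: multiple logistic regression recovering which
# simulated "genomic features" push border probability up or down.
# The features and effect sizes are invented for the demonstration.
rng = np.random.default_rng(0)
n = 2000
X = rng.integers(0, 2, size=(n, 3)).astype(float)  # presence/absence of 3 features
logit = -1.0 + 2.0 * X[:, 0] - 2.0 * X[:, 2]       # feature 0 helps, feature 2 hinders
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

Xd = np.column_stack([np.ones(n), X])               # add an intercept column
w = np.zeros(4)
for _ in range(3000):                               # plain gradient ascent on log-likelihood
    p = 1.0 / (1.0 + np.exp(-Xd @ w))
    w += 0.5 * Xd.T @ (y - p) / n

# w[1] > 0: positive driver; w[3] < 0: negative driver; w[2] ~ 0: no effect
```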

  15. 76 FR 12397 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...

  16. Domain walls at finite temperature

    International Nuclear Information System (INIS)

    Carvalho, C.A. de; Marques, G.C.; Silva, A.J. da; Ventura, I.

    1983-08-01

    It is suggested that the phase transition of lambda phi 4 theory as a function of temperature coincides with the spontaneous appearance of domain walls. Based on one-loop calculations, T sub(c) = 4M/√ lambda is estimated as the temperature at which these domains become energetically favored, to be compared with T sub(c) = 4.9M/√ lambda from effective potential calculations (which are performed directly in the broken phase). Domain walls, as well as other types of fluctuations, disorder the system above T sub(c), leading to ⟨φ⟩ = 0. The critical exponent for the specific heat above T sub(c) is computed, and α = 2/3 + O(√ lambda) is obtained. (Author) [pt

  17. Wavefield Extrapolation in Pseudo-depth Domain

    KAUST Repository

    Ma, Xuxin

    2011-12-11

    Wave-equation based seismic migration and inversion tools are widely used by the energy industry to explore for hydrocarbon and mineral resources. By design, most of these techniques simulate wave propagation in a space domain whose vertical axis is depth measured from the surface. Vertical depth is popular because it is a straightforward mapping of the subsurface space. It is, however, not computationally cost-effective, because the wavelength changes with the local elastic wave velocity, which in general increases with depth in the Earth. As a result, the sampling per wavelength also increases with depth. To avoid spatial aliasing in deep fast media, the seismic wave is oversampled in shallow slow media, which increases the total computation cost. This issue is effectively tackled by using the vertical time axis instead of vertical depth, because in a vertical time representation the "wavelength" is essentially the time period for vertical rays. This thesis extends the vertical time axis to the pseudo-depth axis, which has units of distance while preserving the properties of the vertical time representation. To explore the potential of doing wave-equation based imaging in the pseudo-depth domain, a Partial Differential Equation (PDE) is derived to describe acoustic waves in this new domain. This new PDE is inherently anisotropic because of the use of a constant vertical velocity to convert between depth and vertical time. Such anisotropy results in lower reflection coefficients compared with conventional space-domain modeling results. This feature helps suppress the low-wavenumber artifacts in reverse-time migration images, which are caused by the widely used cross-correlation imaging condition. The thesis illustrates modeling acoustic waves in both the conventional space domain and the pseudo-depth domain. The numerical tool used to model acoustic waves is built on the lowrank approximation of Fourier integral operators. To investigate the potential
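The depth-to-vertical-time mapping that motivates the pseudo-depth axis is easy to sketch: tau(z) is the cumulative two-way vertical traveltime, the integral of 2/v(z') over depth. The velocity profile below is a made-up assumption for illustration.

```python
import numpy as np

# Sketch of the depth -> vertical-time mapping behind the pseudo-depth idea:
# tau(z) = integral_0^z 2/v(z') dz' (two-way vertical traveltime).
# The linear velocity profile is invented for the demonstration.
def depth_to_time(z, v):
    """Cumulative two-way vertical time for depths z with velocities v."""
    dz = np.diff(z, prepend=z[0])
    return np.cumsum(2.0 * dz / v)

z = np.linspace(0.0, 3000.0, 301)   # depth in meters
v = 1500.0 + 0.6 * z                # velocity increasing with depth (m/s)
tau = depth_to_time(z, v)

# A fixed grid spacing in tau spans more meters in fast deep layers, which
# is exactly why the time axis avoids oversampling the deep section.
```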

  18. Helix Nebula: Enabling federation of existing data infrastructures and data services to an overarching cross-domain e-infrastructure

    Science.gov (United States)

    Lengert, Wolfgang; Farres, Jordi; Lanari, Riccardo; Casu, Francesco; Manunta, Michele; Lassalle-Balier, Gerard

    2014-05-01

    Helix Nebula has established a growing public private partnership of more than 30 commercial cloud providers, SMEs, and publicly funded research organisations and e-infrastructures. The Helix Nebula strategy is to establish a federated cloud service across Europe. Three high-profile flagships, sponsored by CERN (high energy physics), EMBL (life sciences) and ESA/DLR/CNES/CNR (earth science), have been deployed and extensively tested within this federated environment. The commitments behind these initial flagships have created a critical mass that attracts suppliers and users to the initiative, to work together towards an "Information as a Service" market place. Significant progress has been achieved in implementing the following 4 programmatic goals (as outlined in the strategic plan, Ref. 1):
    - Goal #1: Establish a Cloud Computing Infrastructure for the European Research Area (ERA) serving as a platform for innovation and evolution of the overall infrastructure.
    - Goal #2: Identify and adopt suitable policies for trust, security and privacy that can be provided on a European level by the European Cloud Computing framework and infrastructure.
    - Goal #3: Create a light-weight governance structure for the future European Cloud Computing Infrastructure that involves all the stakeholders and can evolve over time as the infrastructure, services and user base grow.
    - Goal #4: Define a funding scheme involving the three stakeholder groups (service suppliers, users, EC and national funding agencies) in a Public-Private-Partnership model to implement a Cloud Computing Infrastructure that delivers a sustainable business environment adhering to European-level policies.
    Now, in 2014, a first version of this generic cross-domain e-infrastructure is ready to go into operations, building on a federation of European industry and contributors (data, tools, knowledge, ...). This presentation describes how Helix Nebula is being used in the domain of earth science, focusing on geohazards. The

  19. Architecture for an advanced biomedical collaboration domain for the European paediatric cancer research community (ABCD-4-E).

    Science.gov (United States)

    Nitzlnader, Michael; Falgenhauer, Markus; Gossy, Christian; Schreier, Günter

    2015-01-01

    Today, progress in biomedical research often depends on large, interdisciplinary research projects and tailored information and communication technology (ICT) support. In the context of the European Network for Cancer Research in Children and Adolescents (ENCCA) project the exchange of data between data source (Source Domain) and data consumer (Consumer Domain) systems in a distributed computing environment needs to be facilitated. This work presents the requirements and the corresponding solution architecture of the Advanced Biomedical Collaboration Domain for Europe (ABCD-4-E). The proposed concept utilises public as well as private cloud systems, the Integrating the Healthcare Enterprise (IHE) framework and web-based applications to provide the core capabilities in accordance with privacy and security needs. The utility of crucial parts of the concept was evaluated by prototypic implementation. A discussion of the design indicates that the requirements of ENCCA are fully met. A whole system demonstration is currently being prepared to verify that ABCD-4-E has the potential to evolve into a domain-bridging collaboration platform in the future.

  20. A systemic domain model for ambient pervasive persuasive games

    OpenAIRE

    Eglin, Roger; Eyles, Mark; Dansey, Neil

    2008-01-01

    Through the development of the systemic domain model it is hoped that greater conceptual and theoretical clarity may be brought to understanding the complex and multifaceted nature of pervasive and ambient computer games. This paper presents a conceptual model, the systemic domain model, to illustrate the domain areas that exist in a console, pervasive or ambient game. It is implicit that the regions the systemic domain model describes are contextually dependent. By developing this model it is poss...

  1. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    Energy Technology Data Exchange (ETDEWEB)

    Clement, F.; Vodicka, A.; Weis, P. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Martin, V. [Institut National de Recherches Agronomiques (INRA), 92 - Chetenay Malabry (France); Di Cosmo, R. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Paris-7 Univ., 75 (France)

    2003-07-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids, based on Robin interface conditions, to the problem of flow surrounding an underground nuclear waste disposal site. We show with a simple example how one can refine the mesh locally around the storage site with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be computed in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment devoted to easing parallelism. Thus, at the same time, we test the code coupling and exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)

  2. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    International Nuclear Information System (INIS)

    Clement, F.; Vodicka, A.; Weis, P.; Martin, V.; Di Cosmo, R.

    2003-01-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids, based on Robin interface conditions, to the problem of flow surrounding an underground nuclear waste disposal site. We show with a simple example how one can refine the mesh locally around the storage site with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be computed in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment devoted to easing parallelism. Thus, at the same time, we test the code coupling and exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)
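The Robin-interface iteration can be miniaturized to a 1D Poisson problem. The NumPy sketch below (illustrative only; not the OCamlP3L coupling code) solves -u'' = 2 on (0,1) with two non-overlapping subdomains, each solved separately while a "coupler" exchanges the Robin interface data, mimicking the code-coupling setup described.

```python
import numpy as np

# -u'' = 2 on (0,1), u(0) = u(1) = 0, exact solution u = x(1-x).
# Two non-overlapping subdomains meet at x = 0.5; each solve sees only its
# outer Dirichlet end plus Robin data g at the interface, and a coupler
# exchanges the g values (Lions' Robin-Robin iteration, toy version).
h, m = 0.01, 50        # cells per subdomain; interface at x = 0.5
p, f = 2.0, 2.0        # Robin parameter (p = 1/0.5 works well here), source term

def subdomain_solve(g):
    """FD solve on one half; local index 0 is the node next to the outer
    Dirichlet end, index m-1 is the interface node (both halves mirrored)."""
    A = np.zeros((m, m))
    b = np.full(m, f)
    for i in range(m - 1):                       # interior second differences
        A[i, i] = 2.0 / h**2
        if i > 0:
            A[i, i - 1] = -1.0 / h**2
        A[i, i + 1] = -1.0 / h**2
    A[m - 1, m - 1] = 1.0 / h + p                # Robin row: du/dn + p*u = g
    A[m - 1, m - 2] = -1.0 / h
    b[m - 1] = g
    return np.linalg.solve(A, b)

g1 = g2 = 0.0
for _ in range(5):                               # coupler loop
    u1 = subdomain_solve(g1)                     # left half, solved independently
    u2 = subdomain_solve(g2)                     # right half, solved independently
    g1 = (u2[m - 2] - u2[m - 1]) / h + p * u2[m - 1]   # exchange interface data
    g2 = (u1[m - 2] - u1[m - 1]) / h + p * u1[m - 1]

x1 = h * np.arange(1, m + 1)                     # left-half node positions
err = np.max(np.abs(u1 - x1 * (1.0 - x1)))       # error vs. exact solution
```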

  3. Bregmanized Domain Decomposition for Image Restoration

    KAUST Repository

    Langer, Andreas

    2012-05-22

    Computational problems of large-scale data are gaining attention recently due to better hardware and hence, higher dimensionality of images and data sets acquired in applications. In the last couple of years non-smooth minimization problems such as total variation minimization became increasingly important for the solution of these tasks. While being favorable due to the improved enhancement of images compared to smooth imaging approaches, non-smooth minimization problems typically scale badly with the dimension of the data. Hence, for large imaging problems solved by total variation minimization domain decomposition algorithms have been proposed, aiming to split one large problem into N > 1 smaller problems which can be solved on parallel CPUs. The N subproblems constitute constrained minimization problems, where the constraint enforces the support of the minimizer to be the respective subdomain. In this paper we discuss a fast computational algorithm to solve domain decomposition for total variation minimization. In particular, we accelerate the computation of the subproblems by nested Bregman iterations. We propose a Bregmanized Operator Splitting-Split Bregman (BOS-SB) algorithm, which enforces the restriction onto the respective subdomain by a Bregman iteration that is subsequently solved by a Split Bregman strategy. The computational performance of this new approach is discussed for its application to image inpainting and image deblurring. It turns out that the proposed new solution technique is up to three times faster than the iterative algorithm currently used in domain decomposition methods for total variation minimization. © Springer Science+Business Media, LLC 2012.
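The total variation objective that the subproblems inherit can be illustrated on a 1D signal. The sketch below minimizes a smoothed TV-regularized least-squares energy by plain gradient descent; it shows the objective being solved, not the BOS-SB scheme or any domain splitting, and all parameters are illustrative.

```python
import numpy as np

# Smoothed 1D total variation denoising by plain gradient descent:
# minimize 0.5*||u - f||^2 + lam * sum_i sqrt((u[i+1]-u[i])^2 + eps).
# A sketch of the TV objective only -- not the BOS-SB algorithm.
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant signal
f = clean + 0.1 * rng.standard_normal(clean.size)     # noisy observation

lam, eps, step = 0.3, 1e-3, 0.01
u = f.copy()
for _ in range(3000):
    d = np.diff(u)
    w = d / np.sqrt(d * d + eps)        # derivative of the smoothed |.|
    gtv = np.zeros_like(u)
    gtv[:-1] -= w                       # assemble the TV gradient
    gtv[1:] += w
    u -= step * ((u - f) + lam * gtv)

# u is flattened inside the plateaus while the jump at i = 50 survives.
```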

  4. Calculus domains modelled using an original bool algebra based on polygons

    Science.gov (United States)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2016-08-01

    Analytical and numerical computer-based models require analytical definitions of the calculus domains. The paper presents a method to model a calculus domain based on a bool algebra which uses solid and hollow polygons. The general calculus relations for the geometrical characteristics that are widely used in mechanical engineering are tested using several shapes of the calculus domain, in order to draw conclusions regarding the most effective methods to discretize the domain. The paper also tests the results of several commercial CAD software applications which are able to compute the geometrical characteristics, and interesting conclusions are drawn. The tests also targeted the accuracy of the results vs. the number of nodes on the curved boundary of the cross section. The study required the development of an original software application consisting of more than 1700 lines of computer code. In comparison with other calculus methods, discretization using convex polygons is a simpler approach. Moreover, this method doesn't lead to very large numbers, as the spline approximation did; in that case special software packages were required in order to offer multiple, arbitrary precision. The knowledge resulting from this study may be used to develop complex computer-based models in engineering.
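The solid/hollow polygon idea maps naturally onto the shoelace formulas. The sketch below (a few lines of Python, not the authors' 1700-line application) computes the area and centroid of a region defined as solid polygons minus hollow ones:

```python
def area_centroid(poly):
    """Signed area and centroid of a simple polygon via the shoelace formulas."""
    a = cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return a, cx / (6.0 * a), cy / (6.0 * a)

def region_properties(solids, hollows):
    """Area and centroid of a calculus domain built from solid polygons
    minus hollow ones (hollow areas enter with a negative sign)."""
    A = Sx = Sy = 0.0
    for sign, polys in ((1.0, solids), (-1.0, hollows)):
        for poly in polys:
            a, x, y = area_centroid(poly)
            a = abs(a) * sign
            A += a
            Sx += a * x
            Sy += a * y
    return A, Sx / A, Sy / A

# 4x4 square with a centered 2x2 hole: area 12, centroid at (2, 2)
A, cx, cy = region_properties([[(0, 0), (4, 0), (4, 4), (0, 4)]],
                              [[(1, 1), (3, 1), (3, 3), (1, 3)]])
```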

  5. RISE OF BIOINFORMATICS AND COMPUTATIONAL BIOLOGY IN INDIA: A LOOK THROUGH PUBLICATIONS

    Directory of Open Access Journals (Sweden)

    Anjali Srivastava

    2017-09-01

    Full Text Available Computational biology and bioinformatics have been part and parcel of biomedical research for a few decades now. However, the institutionalization of bioinformatics research took place with the establishment of Distributed Information Centres (DISCs) at reputed research institutions in various disciplines by the Department of Biotechnology, Government of India. Though, at the initial stages, this endeavor was mainly focused on providing infrastructure for using information technology and internet-based communication and tools for carrying out computational biology and in-silico assisted research in varied arenas, from disease biology to agricultural crops, spices, veterinary science and many more, the natural outcome of the establishment of such facilities was new experiments with bioinformatics tools. Thus, the Biotechnology Information Systems (BTIS) programme grew into a solid movement and a large number of publications started coming out of these centres. By the end of the last century, bioinformatics had started developing into a full-fledged research subject. In the last decade, a need was felt to make a factual estimation of the results of this endeavor of DBT, which had, by then, established about two hundred centres in almost all disciplines of biomedical research. In a bid to evaluate the efforts and outcomes of these centres, the BTIS Centre at CSIR-CDRI, Lucknow was entrusted with collecting and collating the publications of these centres. However, when the full data was compiled, the DBT task force felt that the study must include non-BTIS centres as well, so as to expand the report to give a glimpse of bioinformatics publications from the whole country.

  6. N-Terminal Domains in Two-Domain Proteins Are Biased to Be Shorter and Predicted to Fold Faster Than Their C-Terminal Counterparts

    Directory of Open Access Journals (Sweden)

    Etai Jacob

    2013-04-01

    Full Text Available Computational analysis of proteomes in all kingdoms of life reveals a strong tendency for N-terminal domains in two-domain proteins to have shorter sequences than their neighboring C-terminal domains. Given that folding rates are affected by chain length, we asked whether the tendency for N-terminal domains to be shorter than their neighboring C-terminal domains reflects selection for faster-folding N-terminal domains. Calculations of absolute contact order, another predictor of folding rate, provide additional evidence that N-terminal domains tend to fold faster than their neighboring C-terminal domains. A possible explanation for this bias, which is more pronounced in prokaryotes than in eukaryotes, is that faster folding of N-terminal domains reduces the risk for protein aggregation during folding by preventing formation of nonnative interdomain interactions. This explanation is supported by our finding that two-domain proteins with a shorter N-terminal domain are much more abundant than those with a shorter C-terminal domain.
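Absolute contact order, the folding-rate predictor invoked above, is simply the mean sequence separation over a protein's native contacts. A toy version (the contact lists are invented, not taken from real structures):

```python
def absolute_contact_order(contacts):
    """Mean sequence separation |i - j| over a protein's native contacts."""
    return sum(abs(j - i) for i, j in contacts) / len(contacts)

# Toy contact sets: a "fast-folding" domain's contacts are mostly local in
# sequence, a "slow" one's are long-range (illustrative numbers only).
fast_domain = [(1, 4), (2, 6), (5, 9), (7, 10)]
slow_domain = [(1, 40), (3, 25), (10, 52), (12, 33)]
```

A lower value predicts faster folding, which is the comparison the abstract makes between N- and C-terminal domains.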

  7. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    International Nuclear Information System (INIS)

    Meena, T.R.; Anoj Kumar; Patra, R.P.; Vikas; Patil, S.S.; Chatterjee, M.K.; Sharma, Ranjit; Murali, S.

    2014-01-01

    National-level response mechanisms have been developed at the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, and by the National Disaster Response Forces with the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifinders and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought into the public domain in some other form, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated their capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi.

  8. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    Energy Technology Data Exchange (ETDEWEB)

    Meena, T. R.; Kumar, Anoj; Patra, R. P.; Vikas,; Patil, S. S.; Chatterjee, M. K.; Sharma, Ranjit; Murali, S., E-mail: tejram@barc.gov.in [Radiation Safety Systems Division, Bhabha Atomic Research Centre, Mumbai (India)

    2014-07-01

    National-level response mechanisms have been developed at the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, and by the National Disaster Response Forces with the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifinders and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought into the public domain in some other form, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated their capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi.

  9. Technique for designing a domain ontology

    OpenAIRE

    Palagin, A. V.; Petrenko, N. G.; Malakhov, K. S.

    2018-01-01

    The article describes a technique for designing a domain ontology, shows the flowchart of the design algorithm, and considers an example of constructing a fragment of an ontology for the subject area of Computer Science.

  10. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

    Recent mathematical developments have shown the importance of the elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are put in evidence, and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations. But the course of a computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. This structural analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  11. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier.

    Science.gov (United States)

    Luo, An; Sullivan, Thomas J

    2010-04-01

    We introduce a user-friendly steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) system. Single-channel EEG is recorded using a low-noise dry electrode. Compared to traditional gel-based multi-sensor EEG systems, a dry sensor proves to be more convenient, comfortable and cost effective. A hardware system was built that displays four LED light panels flashing at different frequencies and synchronizes with EEG acquisition. The visual stimuli have been carefully designed such that potential risk to photosensitive people is minimized. We describe a novel stimulus-locked inter-trace correlation (SLIC) method for SSVEP classification using EEG time-locked to stimulus onsets. We studied how the performance of the algorithm is affected by different selection of parameters. Using the SLIC method, the average light detection rate is 75.8% with very low error rates (an 8.4% false positive rate and a 1.3% misclassification rate). Compared to a traditional frequency-domain-based method, the SLIC method is more robust (resulting in less annoyance to the users) and is also suitable for irregular stimulus patterns.
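The SLIC idea can be sketched with synthetic data: epoch the single channel at each light's onset times and pick the light whose onset-locked traces correlate most strongly with one another. The frequencies, epoch length, and noise level below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Sketch of stimulus-locked inter-trace correlation (SLIC) on synthetic
# single-channel "EEG": the attended light's onsets yield phase-consistent
# epochs, so their mutual correlation is highest. Illustrative only.
fs, dur = 256, 8.0
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(1)
target = 8.0                                      # user attends the 8 Hz light
eeg = np.sin(2 * np.pi * target * t) + 0.8 * rng.standard_normal(t.size)

def mean_intertrace_corr(eeg, onsets, length):
    traces = np.array([eeg[s:s + length] for s in onsets if s + length <= eeg.size])
    c = np.corrcoef(traces)                       # rows = onset-locked traces
    n = c.shape[0]
    return (c.sum() - n) / (n * (n - 1))          # mean off-diagonal correlation

length = 32                                       # 125 ms epochs
scores = {}
for f in (6.0, 8.0, 10.0, 13.0):                  # candidate flicker rates
    onsets = (np.arange(0, dur * f) * fs / f).astype(int)
    scores[f] = mean_intertrace_corr(eeg, onsets, length)

detected = max(scores, key=scores.get)            # highest score wins
```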

  12. Exploring the Deep-Level Reasoning Questions Effect during Vicarious Learning among Eighth to Eleventh Graders in the Domains of Computer Literacy and Newtonian Physics

    Science.gov (United States)

    Gholson, Barry; Witherspoon, Amy; Morgan, Brent; Brittingham, Joshua K.; Coles, Robert; Graesser, Arthur C.; Sullins, Jeremiah; Craig, Scotty D.

    2009-01-01

    This paper tested the deep-level reasoning questions effect in the domains of computer literacy for eighth and tenth graders and Newtonian physics for ninth and eleventh graders. This effect claims that learning is facilitated when the materials are organized around questions that invite deep reasoning. The literature indicates that vicarious…

  13. Cloud identification using genetic algorithms and massively parallel computation

    Science.gov (United States)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA-administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used, although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). Given that one is willing to run the experiment several times (say 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated before, so these results are encouraging even though less impressive than the cloud experiment. Successful completion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user
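The island-model ideas the project explored (subpopulations, shared breeding pools, migration of individuals) can be sketched serially. In the toy below, OneMax stands in for the pixel-classification fitness, and every parameter is an illustrative assumption, not a value from the study.

```python
import random

# Serial sketch of an island-model genetic algorithm: several subpopulations
# evolve independently and periodically exchange their best individuals.
# OneMax (count of 1-bits) stands in for the real classification fitness.
random.seed(42)
L, POP, ISLANDS, GENS = 40, 30, 4, 60

def fitness(ind):
    return sum(ind)

def tournament(pop):
    a, b = random.sample(pop, 2)                 # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

def step(pop):
    new = []
    for _ in range(len(pop)):
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(1, L)
        child = p1[:cut] + p2[cut:]              # one-point crossover
        if random.random() < 0.2:                # single-bit mutation
            i = random.randrange(L)
            child[i] = 1 - child[i]
        new.append(child)
    return new

islands = [[[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
           for _ in range(ISLANDS)]
for g in range(GENS):
    islands = [step(pop) for pop in islands]
    if g % 10 == 9:                              # migration every 10 generations
        bests = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            pop[random.randrange(POP)] = list(bests[(i - 1) % ISLANDS])

best = max((max(pop, key=fitness) for pop in islands), key=fitness)
```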

  14. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful productivity tool, allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding a domain-specific language into an existing language allows easy interoperability with non-domain-specific code and the use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain-specific code that has access to all the low-level functionality that C and C++ offer, as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language, which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics' high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.
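    The embedding idea described here is not unique to C++. As a rough illustration in Python (standing in for the paper's C++ expression templates), operator overloading can make domain code read as ordinary arithmetic while actually building an expression tree that a backend is free to optimize or offload before evaluation. All names below are invented:

```python
# A deferred-evaluation expression "EDSL" via operator overloading. In C++
# the same idea is built with templates; this sketch only illustrates the
# embedding concept, not the paper's tools.

class Expr:
    def __add__(self, other):
        return Add(self, wrap(other))
    def __mul__(self, other):
        return Mul(self, wrap(other))

class Var(Expr):
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]

class Const(Expr):
    def __init__(self, value): self.value = value
    def eval(self, env): return self.value

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) * self.b.eval(env)

def wrap(x):
    return x if isinstance(x, Expr) else Const(x)

# Domain code reads like arithmetic but builds an expression tree,
# which a "computing expert" backend could rewrite before evaluating.
x, y = Var("x"), Var("y")
formula = x * x + y * 2
print(formula.eval({"x": 3, "y": 4}))  # → 17
```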

  15. Web Syndication Approaches for Sharing Primary Data in "Small Science" Domains

    Directory of Open Access Journals (Sweden)

    Eric C Kansa

    2010-06-01

    Full Text Available In some areas of science, sophisticated web services and semantics underlie "cyberinfrastructure". However, in "small science" domains, especially in field sciences such as archaeology, conservation, and public health, datasets often resist standardization. Publishing data in the small sciences should embrace this diversity rather than attempt to corral research into "universal" domain standards. A growing ecosystem of increasingly powerful Web-syndication-based approaches for sharing data on the public Web can offer a viable alternative. Atom-feed-based services can be used with scientific collections to identify and create linkages across different datasets, even across disciplinary boundaries, without shared domain standards.
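    As a concrete illustration of the syndication approach, a minimal Atom feed for a dataset collection can be built with standard tooling. The collection name and identifiers below are invented:

```python
# Minimal Atom feed for a hypothetical "small science" dataset collection,
# built with the Python standard library only; entry ids are made up.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)

def tag(name):
    return f"{{{ATOM}}}{name}"

feed = ET.Element(tag("feed"))
ET.SubElement(feed, tag("title")).text = "Excavation Unit Records"
ET.SubElement(feed, tag("id")).text = "urn:example:dataset:units"
ET.SubElement(feed, tag("updated")).text = "2010-06-01T00:00:00Z"

records = [("Unit 12, locus 3", "urn:example:unit:12-3"),
           ("Unit 14, locus 1", "urn:example:unit:14-1")]
for title, uid in records:
    entry = ET.SubElement(feed, tag("entry"))
    ET.SubElement(entry, tag("title")).text = title
    ET.SubElement(entry, tag("id")).text = uid
    ET.SubElement(entry, tag("updated")).text = "2010-06-01T00:00:00Z"

xml = ET.tostring(feed, encoding="unicode")
print(xml[:60])
```

    A consumer only needs the Atom vocabulary (title, id, updated) to harvest such feeds, which is the point: linkage without a shared domain schema.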

  16. AN INTELLIGENT CONVERSATION AGENT FOR HEALTH CARE DOMAIN

    Directory of Open Access Journals (Sweden)

    K. Karpagam

    2014-04-01

    Full Text Available Human Computer Interaction is one of the pervasive application areas of computer science, developing multimodal interaction for information sharing. The conversation agent acts as the core component for developing interfaces between a system and a user, applying AI to produce proper responses. In this paper, the interactive system plays a vital role in improving the user's knowledge in the health domain through an intelligent interface between machine and human using text and speech. The primary aim is to enrich the user's knowledge and to help the user in the health domain, using a conversation agent that offers immediate responses with the feel of a human companion.


  17. Domain interaction in rabbit muscle pyruvate kinase. II. Small angle neutron scattering and computer simulation.

    Science.gov (United States)

    Consler, T G; Uberbacher, E C; Bunick, G J; Liebman, M N; Lee, J C

    1988-02-25

    The effects of ligands on the structure of rabbit muscle pyruvate kinase were studied by small angle neutron scattering. The radius of gyration, RG, decreases by about 1 A in the presence of the substrate phosphoenolpyruvate, but increases by about the same magnitude in the presence of the allosteric inhibitor phenylalanine. With increasing pH or in the absence of Mg2+ and K+, the RG of pyruvate kinase increases. Hence, there is a 2-A difference in RG between two alternative conformations. Length distribution analysis indicates that, under all experimental conditions which increase the radius of gyration, there is a pronounced increase observed in the probability for interatomic distance between 80 and 110 A. These small angle neutron scattering results indicate a "contraction" and "expansion" of the enzyme when it transforms between its active and inactive forms. Using the alpha-carbon coordinates of crystalline cat muscle pyruvate kinase, a length distribution profile was calculated, and it matches the scattering profile of the inactive form. These observations are expected since the crystals were grown in the absence of divalent cations (Stuart, D. I., Levine, M., Muirhead, H., and Stammers, D. K. (1979) J. Mol. Biol. 134, 109-142). Hence, results from neutron scattering, x-ray crystallographic, and sedimentation studies (Oberfelder, R. W., Lee, L. L.-Y., and Lee, J.C. (1984) Biochemistry 23, 3813-3821) are totally consistent with each other. With the aid of computer modeling, the crystal structure has been manipulated in order to effect changes that are consistent with the conformational change described by the solution scattering data. The structural manipulation involves the rotation of the B domain relative to the A domain, leading to the closure of the cleft between these domains. These manipulations resulted in the generation of new sets of atomic (C-alpha) coordinates, which were utilized in calculations, the result of which compared favorably with the
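    Both quantities discussed above, the radius of gyration and the interatomic length distribution, can be computed directly from atomic coordinates. A minimal Python sketch, using four made-up points rather than the actual C-alpha coordinate set:

```python
# Radius of gyration and a coarse interatomic distance histogram from
# point coordinates; the four points below are invented stand-ins for a
# real C-alpha coordinate set.
import math

coords = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (3.8, 3.8, 0.0), (0.0, 3.8, 3.8)]

def radius_of_gyration(pts):
    n = len(pts)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    cz = sum(p[2] for p in pts) / n
    return math.sqrt(sum((x - cx)**2 + (y - cy)**2 + (z - cz)**2
                         for x, y, z in pts) / n)

def distance_histogram(pts, bin_width=2.0):
    # Counts of pairwise distances per bin: a crude length distribution P(r)
    hist = {}
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = math.dist(pts[i], pts[j])
            b = int(d // bin_width) * bin_width
            hist[b] = hist.get(b, 0) + 1
    return hist

print(round(radius_of_gyration(coords), 2))
print(distance_histogram(coords))
```

    A 1-2 Å shift in the computed radius of gyration, as reported in the study, corresponds to mass redistributing relative to the centroid, which is what the domain-rotation modeling reproduces.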

  18. Time and frequency domain analyses of the Hualien Large-Scale Seismic Test

    International Nuclear Information System (INIS)

    Kabanda, John; Kwon, Oh-Sung; Kwon, Gunup

    2015-01-01

    Highlights: • Time- and frequency-domain analysis methods are verified against each other. • The two analysis methods are validated against Hualien LSST. • The nonlinear time domain (NLTD) analysis resulted in more realistic response. • The frequency domain (FD) analysis shows amplification at resonant frequencies. • The NLTD analysis requires significant modeling and computing time. - Abstract: In the nuclear industry, the equivalent-linear frequency domain analysis method has been the de facto standard procedure primarily due to the method's computational efficiency. This study explores the feasibility of applying the nonlinear time domain analysis method for the soil–structure-interaction analysis of nuclear power facilities. As a first step, the equivalency of the time and frequency domain analysis methods is verified through a site response analysis of one-dimensional soil, a dynamic impedance analysis of soil–foundation system, and a seismic response analysis of the entire soil–structure system. For the verifications, an idealized elastic soil–structure system is used to minimize variables in the comparison of the two methods. Then, the verified analysis methods are used to develop time and frequency domain models of Hualien Large-Scale Seismic Test. The predicted structural responses are compared against field measurements. The models are also analyzed with an amplified ground motion to evaluate discrepancies of the time and frequency domain analysis methods when the soil–structure system behaves beyond the elastic range. The analysis results show that the equivalent-linear frequency domain analysis method amplifies certain frequency bands and tends to result in higher structural acceleration than the nonlinear time domain analysis method. A comparison with field measurements shows that the nonlinear time domain analysis method better captures the frequency distribution of recorded structural responses than the frequency domain
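    The equivalency verification described above rests on a basic property of linear systems: convolution with the impulse response in the time domain must agree with multiplication by the transfer function in the frequency domain. The Python sketch below checks this in miniature with an invented input and impulse response, using a naive DFT:

```python
# Time-domain convolution vs. frequency-domain multiplication for a
# linear system; signals and "soil column" impulse response are made up.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * i * k / n) for k in range(n))
            for i in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[i] * cmath.exp(2j * cmath.pi * i * k / n) for i in range(n)).real / n
            for k in range(n)]

def convolve(x, h):
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

x = [1.0, 2.0, 0.0, -1.0]   # input "ground motion" samples (invented)
h = [0.5, 0.25, 0.125]      # impulse response samples (invented)

# Zero-pad so circular convolution equals linear convolution
n = len(x) + len(h) - 1
X = dft(x + [0.0] * (n - len(x)))
H = dft(h + [0.0] * (n - len(h)))
freq_result = idft([a * b for a, b in zip(X, H)])
time_result = convolve(x, h)

print(all(abs(a - b) < 1e-9 for a, b in zip(time_result, freq_result)))
```

    The equivalence only holds while the system stays linear, which is why the nonlinear time-domain and equivalent-linear frequency-domain results diverge under the amplified ground motion.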

  19. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE USING RECORDS AND DONATED... for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  20. An Efficient Semi-supervised Learning Approach to Predict SH2 Domain Mediated Interactions.

    Science.gov (United States)

    Kundu, Kousik; Backofen, Rolf

    2017-01-01

    Src homology 2 (SH2) domain is an important subclass of modular protein domains that plays an indispensable role in several biological processes in eukaryotes. SH2 domains specifically bind to the phosphotyrosine residue of their binding peptides to facilitate various molecular functions. For determining the subtle binding specificities of SH2 domains, it is very important to understand the intriguing mechanisms by which these domains recognize their target peptides in a complex cellular environment. Several attempts have been made to predict SH2-peptide interactions using high-throughput data. However, these high-throughput data are often affected by a low signal-to-noise ratio. Furthermore, the prediction methods have several additional shortcomings, such as the linearity problem and high computational complexity. Thus, computational identification of SH2-peptide interactions using high-throughput data remains challenging. Here, we propose a machine learning approach based on an efficient semi-supervised learning technique for the prediction of 51 SH2 domain mediated interactions in the human proteome. In our study, we have successfully employed several strategies to tackle the major problems in computational identification of SH2-peptide interactions.
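    Semi-supervised learning in this setting is often built on a self-training loop: fit on the labeled data, pseudo-label the unlabeled points the model is confident about, and refit. The Python sketch below uses a toy 1-D nearest-centroid classifier and invented data, not the authors' model or features:

```python
# Self-training sketch: a toy stand-in for semi-supervised interaction
# prediction. The 1-D "features" and labels are invented.
def nearest_centroid_fit(points, labels):
    groups = {}
    for p, y in zip(points, labels):
        groups.setdefault(y, []).append(p)
    return {y: sum(ps) / len(ps) for y, ps in groups.items()}

def predict(centroids, p):
    return min(centroids, key=lambda y: abs(centroids[y] - p))

labeled = [(0.1, 0), (0.2, 0), (0.9, 1), (1.0, 1)]      # binder / non-binder
unlabeled = [0.15, 0.05, 0.95, 0.85, 0.5]

points = [p for p, _ in labeled]
labels = [y for _, y in labeled]
for _ in range(3):  # a few self-training rounds
    centroids = nearest_centroid_fit(points, labels)
    # Pseudo-label only confident points (far from the decision midpoint)
    mid = sum(centroids.values()) / len(centroids)
    confident = [p for p in unlabeled if abs(p - mid) > 0.25]
    points = [p for p, _ in labeled] + confident
    labels = [y for _, y in labeled] + [predict(centroids, p) for p in confident]

centroids = nearest_centroid_fit(points, labels)
print(predict(centroids, 0.12), predict(centroids, 0.88))
```

    The confidence threshold is what keeps low signal-to-noise points (like the ambiguous 0.5 above) out of the training pool, which is the practical appeal of semi-supervised methods on noisy high-throughput data.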

  1. Calculation of nonzero-temperature Casimir forces in the time domain

    International Nuclear Information System (INIS)

    Pan, Kai; Reid, M. T. Homer; McCauley, Alexander P.; Rodriguez, Alejandro W.; White, Jacob K.; Johnson, Steven G.

    2011-01-01

    We show how to compute Casimir forces at nonzero temperatures with time-domain electromagnetic simulations, for example, using a finite-difference time-domain (FDTD) method. Compared to our previous zero-temperature time-domain method, only a small modification is required, but we explain that some care is required to properly capture the zero-frequency contribution. We validate the method against analytical and numerical frequency-domain calculations, and show a surprising high-temperature disappearance of a nonmonotonic behavior previously demonstrated in a pistonlike geometry.

  2. The insurance industry and public-private collaborations as a vector to develop and spread EO technologies and techniques in the domain of Food Security: The Swiss Re case.

    Science.gov (United States)

    Coutu, S.; Ragaz, M.; Mäder, D.; Hammer, P.; Andriesse, M.; Güttinger, U.; Feyen, H.

    2017-12-01

    The insurance industry has been contributing to the resilient development of agriculture in multiple regions of the globe since the beginning of the 19th century. From the very beginning of the development of EO sciences, it has also kept a close eye on the development of technologies and techniques in this domain. Recent advances in this area, such as increased satellite imagery resolution, faster computation, and Big Data management, combined with the ground-based knowledge of the insurance industry, have offered farmers not only tools permitting better crop management, but also reliable and live yield coverage. This study presents several of these applications at different scales (industrial farming and micro-farming) and in different climate regions, with an emphasis on the limits of current products. Some of these limits, such as lack of access to ground data, R&D efforts, or understanding of needs on the ground, could be quickly overcome through closer public-private or private-private collaborations. However, despite a clear benefit for the Food Security nexus and potential win-win situations, those collaborations are not always simple to develop. We present here successful but also disappointing collaboration cases based on the experience of Swiss Re as a global insurance leader. As a conclusion, we highlight how academia, NGOs, governmental organizations, start-ups, and the insurance industry can come together to foster the development of EO in the domain of Food Security and bring cutting-edge science to game-changing industrial applications.

  3. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...

  4. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud Computing is an Internet based computing, whereby shared resources, software and information, are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as a business model for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  5. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses from existing drugs and holds the great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data of small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  6. THE USE OF COMPUTER APPLICATIONS IN THE STUDY OF ROMANIA'S PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Popeanga Vasile

    2011-07-01

    Full Text Available Total public debt represents all monetary obligations of the state (government, public institutions, financial and administrative-territorial units) at a given time, resulting from internal and external loans (in lei and in foreign currencies) contracted on short, medium, and long term, together with the state treasury's own obligations for the amounts advanced temporarily to cover the budget deficit. Loans may be contracted by the state through the Ministry of Finance, in its own name or guaranteed by it. Public debt is expressed in local or foreign currency, depending on where, and under what conditions, the loans were contracted. In order to evaluate Romania's public debt, obligations denominated in a currency other than the national currency are calculated using the exchange rate of the National Bank of Romania. Also, the total public debt of a country can be expressed in absolute values (to know the burden that country's economy bears toward its creditors), in relative values as a percentage of GDP (to allow comparison over time and between countries), and as an average size per capita (to allow comparisons and analysis in time and space). Total public debt is calculated, and its two forms are managed separately, namely domestic public debt and external public debt. The Ministry of Finance prepares and submits annually to the Government for approval, and to Parliament for information, a report on public debt, which contains information on the government debt portfolio, debt service, public indebtedness indicators, the primary and secondary markets for state securities, and the implementation of the medium-term government debt management strategy for the previous year. In order to make quick and effective comparisons of public debt dynamics in Romania, Excel 2010 offers new features such as sparklines and slicers, which can help discover trends and statistics in the existing data.
    The aim of this article is an accurate assessment of Romania's public debt and its

  7. An efficient domain decomposition strategy for wave loads on surface piercing circular cylinders

    DEFF Research Database (Denmark)

    Paulsen, Bo Terp; Bredmose, Henrik; Bingham, Harry B.

    2014-01-01

    A fully nonlinear domain decomposed solver is proposed for efficient computations of wave loads on surface piercing structures in the time domain. A fully nonlinear potential flow solver was combined with a fully nonlinear Navier–Stokes/VOF solver via generalized coupling zones of arbitrary shape....... Sensitivity tests of the extent of the inner Navier–Stokes/VOF domain were carried out. Numerical computations of wave loads on surface piercing circular cylinders at intermediate water depths are presented. Four different test cases of increasing complexity were considered; 1) weakly nonlinear regular waves...

  8. Time-Domain Techniques for Computation and Reconstruction of One-Dimensional Profiles

    Directory of Open Access Journals (Sweden)

    M. Rahman

    2005-01-01

    Full Text Available This paper presents a time-domain technique to compute the electromagnetic fields and to reconstruct the permittivity profile within a one-dimensional medium of finite length. The medium is characterized by a permittivity as well as conductivity profile which vary only with depth. The discussed scattering problem is thus one-dimensional. The modeling tool is divided into two different schemes which are named as the forward solver and the inverse solver. The task of the forward solver is to compute the internal fields of the specimen which is performed by Green’s function approach. When a known electromagnetic wave is incident normally on the media, the resulting electromagnetic field within the media can be calculated by constructing a Green’s operator. This operator maps the incident field on either side of the medium to the field at an arbitrary observation point. It is nothing but a matrix of integral operators with kernels satisfying known partial differential equations. The reflection and transmission behavior of the medium is also determined from the boundary values of the Green's operator. The inverse solver is responsible for solving an inverse scattering problem by reconstructing the permittivity profile of the medium. Though it is possible to use several algorithms to solve this problem, the invariant embedding method, also known as the layer-stripping method, has been implemented here due to the advantage that it requires a finite time trace of reflection data. Here only one round trip of reflection data is used, where one round trip is defined by the time required by the pulse to propagate through the medium and back again. The inversion process begins by retrieving the reflection kernel from the reflected wave data by simply using a deconvolution technique. The rest of the task can easily be performed by applying a numerical approach to determine different profile parameters. Both the solvers have been found to have the

  9. Cross-Sectional Associations between Home Environmental Factors and Domain-Specific Sedentary Behaviors in Adults: The Moderating Role of Socio-Demographic Variables and BMI

    Science.gov (United States)

    Busschaert, Cedric; Cardon, Greet; Chastin, Sebastien F. M.; Van Cauwenberg, Jelle; De Cocker, Katrien

    2017-01-01

    Despite the negative health effects of too much sitting, the majority of adults are too sedentary. To develop effective interventions, insight is needed into home environmental correlates of adults' sedentary behaviors, and into the susceptibility of population subgroups to these home environmental cues. In total, 559 Flemish adults reported socio-demographics, weight and height, home environmental factors, and domain-specific sedentary behaviors. Generalized linear modeling was conducted to examine main associations between home environmental factors and domain-specific sedentary behaviors, and to test the moderating role of socio-demographics and BMI on these associations. In case of significant interactions, stratified analyses were performed. Results showed that, among those who did use a computer/laptop during the last week, a one-unit increase in the number of computers or laptops was associated with 17% (OR = 1.17; 95% CI = 1.02, 1.34) and 24% (OR = 1.24; 95% CI = 1.08, 1.43) more minutes of computer time per day, respectively. The association with the proximity of the remote controller was moderated by BMI, with significant positive associations limited to those who were not overweight. To conclude, home environmental factors were associated with domain-specific sedentary behaviors, especially in healthy-weight adults. If confirmed by longitudinal studies, public health professionals should encourage adults to limit the number of indoor entertainment devices and motorized vehicles. PMID:29088089

  10. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression.

    Science.gov (United States)

    Yu, Xu; Lin, Jun-Yu; Jiang, Feng; Du, Jun-Wei; Han, Ji-Zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot evaluate effectively the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted as the weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid underfitting or overfitting problem occurring in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring the useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods.
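    The LWLR component can be sketched compactly: each query point gets its own least-squares fit, with kernel weights centered on the query, so no global parametric form is assumed. A 1-D Python sketch on synthetic data (the paper's cross-domain feature construction is not modeled here):

```python
# 1-D locally weighted linear regression (LWLR): a separate weighted
# least-squares fit per query point. Data are synthetic.
import math

def lwlr(query, xs, ys, tau=0.5):
    # Gaussian kernel weights centered on the query point
    w = [math.exp(-(x - query) ** 2 / (2 * tau ** 2)) for x in xs]
    # Weighted least squares for y = a + b*x via 2x2 normal equations
    sw = sum(w)
    sx = sum(wi * x for wi, x in zip(w, xs))
    sy = sum(wi * y for wi, y in zip(w, ys))
    sxx = sum(wi * x * x for wi, x in zip(w, xs))
    sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = sw * sxx - sx * sx
    a = (sxx * sy - sx * sxy) / det
    b = (sw * sxy - sx * sy) / det
    return a + b * query

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 2.0, 3.0, 4.0]   # exactly linear, so LWLR recovers y = x
print(round(lwlr(2.5, xs, ys), 6))
```

    Because the fit is local, LWLR adapts to curvature in the data without a fixed model order, which is the nonparametric property the paper relies on to avoid under- and overfitting.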

  11. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  12. RASCAL: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  13. The DIMA web resource--exploring the protein domain network.

    Science.gov (United States)

    Pagel, Philipp; Oesterheld, Matthias; Stümpflen, Volker; Frishman, Dmitrij

    2006-04-15

    Conserved domains represent essential building blocks of most known proteins. Owing to their role as modular components carrying out specific functions they form a network based both on functional relations and direct physical interactions. We have previously shown that domain interaction networks provide substantially novel information with respect to networks built on full-length protein chains. In this work we present a comprehensive web resource for exploring the Domain Interaction MAp (DIMA), interactively. The tool aims at integration of multiple data sources and prediction techniques, two of which have been implemented so far: domain phylogenetic profiling and experimentally demonstrated domain contacts from known three-dimensional structures. A powerful yet simple user interface enables the user to compute, visualize, navigate and download domain networks based on specific search criteria. http://mips.gsf.de/genre/proj/dima
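    One of the two prediction techniques mentioned, domain phylogenetic profiling, scores domain pairs by the similarity of their presence/absence patterns across genomes. A toy Python sketch with invented profiles (not DIMA's actual data or scoring function):

```python
# Phylogenetic profiling sketch: domains with similar presence/absence
# patterns across genomes are candidate functional partners.
# Profiles below are invented for illustration.
profiles = {
    "SH3": [1, 1, 0, 1, 0, 1],
    "SH2": [1, 1, 0, 1, 0, 0],
    "PAS": [0, 0, 1, 0, 1, 0],
}

def jaccard(a, b):
    # Similarity of two binary presence/absence vectors
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 0.0

pairs = [("SH3", "SH2"), ("SH3", "PAS"), ("SH2", "PAS")]
scores = {p: jaccard(profiles[p[0]], profiles[p[1]]) for p in pairs}
print(max(scores, key=scores.get))
```

    High-scoring pairs become candidate edges in the domain interaction map, to be cross-checked against the structurally demonstrated contacts DIMA also integrates.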

  14. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  15. CARDS: A blueprint and environment for domain-specific software reuse

    Science.gov (United States)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  16. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas

    International Nuclear Information System (INIS)

    2001-01-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general direction of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - the public utility in the domain of energy: definition of the public utility missions, experience feedback about liberalized markets, public utility obligation and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supplies, opinion of operators, security of power supplies versus liberalization and investments; 4 - security of gas supplies: markets liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and of security of supplies in a competing market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  17. Moving domain computational fluid dynamics to interface with an embryonic model of cardiac morphogenesis.

    Directory of Open Access Journals (Sweden)

    Juhyun Lee

    Full Text Available Peristaltic contraction of the embryonic heart tube produces time- and spatially-varying wall shear stress (WSS) and pressure gradients (∇P) across the atrioventricular (AV) canal. Zebrafish (Danio rerio) are a genetically tractable system to investigate cardiac morphogenesis. The use of Tg(fli1a:EGFP)y1 transgenic embryos allowed for delineation and two-dimensional reconstruction of the endocardium. This time-varying wall motion was then prescribed in a two-dimensional moving-domain computational fluid dynamics (CFD) model, providing new insights into spatial and temporal variations in WSS and ∇P during cardiac development. The CFD simulations were validated with particle image velocimetry (PIV) across the atrioventricular (AV) canal, revealing an increase in both velocities and heart rates, but a decrease in the duration of atrial systole, from early to later stages. At 20-30 hours post fertilization (hpf), simulation results revealed bidirectional WSS across the AV canal in the heart tube in response to peristaltic motion of the wall. At 40-50 hpf, the tube structure undergoes cardiac looping, accompanied by a nearly 3-fold increase in WSS magnitude. At 110-120 hpf, a distinct AV valve, atrium, ventricle, and bulbus arteriosus form, accompanied by incremental increases in both WSS magnitude and ∇P, but a decrease in bidirectional flow. Laminar flow develops across the AV canal at 20-30 hpf and persists at 110-120 hpf. Reynolds numbers at the AV canal increase from 0.07±0.03 at 20-30 hpf to 0.23±0.07 at 110-120 hpf (p < 0.05, n = 6), whereas Womersley numbers remain relatively unchanged, from 0.11 to 0.13. Our moving-domain simulations highlight hemodynamic changes in relation to cardiac morphogenesis, thereby providing a 2-D quantitative approach to complement imaging analysis.
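    The dimensionless numbers reported above follow from their standard definitions: Re = vL/ν and the Womersley number α = r·sqrt(ω/ν). The Python sketch below uses assumed embryonic-scale values, not the paper's measured parameters:

```python
# Reynolds and Womersley numbers for a small pulsatile flow, with
# made-up embryonic-scale parameter values (assumptions, not the
# paper's data).
import math

def reynolds(velocity, length, kinematic_viscosity):
    return velocity * length / kinematic_viscosity

def womersley(radius, angular_frequency, kinematic_viscosity):
    return radius * math.sqrt(angular_frequency / kinematic_viscosity)

nu = 3.5e-6   # m^2/s, blood-like kinematic viscosity (assumed)
d = 50e-6     # m, AV-canal diameter scale (assumed)
v = 5e-3      # m/s, peak velocity (assumed)
f = 2.0       # Hz, heart rate (assumed)

re = reynolds(v, d, nu)
wo = womersley(d / 2, 2 * math.pi * f, nu)
print(round(re, 3), round(wo, 3))
```

    Both numbers come out well below 1 at this scale, consistent with the laminar, viscosity-dominated regime the study reports.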

  18. Screen time by different devices in adolescents: association with physical inactivity domains and eating habits.

    Science.gov (United States)

    Delfino, Leandro D; Dos Santos Silva, Diego A; Tebar, William R; Zanuto, Edner F; Codogno, Jamile S; Fernandes, Rômulo A; Christofaro, Diego G

    2018-03-01

    Sedentary behaviors in adolescents are associated with using screen devices, usually analyzed as the total daily time spent watching television, using the computer, and playing video games. However, an independent and clustered analysis of devices allows greater understanding of the associations with physical inactivity domains and eating habits in adolescents. The sample comprised adolescents aged 10-17 years (N = 1011) from public and private schools, randomly selected. The use of screen devices was measured in hours per week spent on each device: TV, computer, video games, and mobile phone/tablet. Physical inactivity domains (school, leisure, and sports), eating habits (weekly food consumption frequency), and socioeconomic status were assessed by questionnaire. The prevalence of high use of mobile phone/tablet was 70% among adolescents, 63% showed high use of TV or computer, and 24% reported high use of video games. High use of video games was greater among boys, and high use of mobile phone/tablet was higher among girls. Significant associations between high use of TV (OR = 1.43, 95% CI: 1.04-1.99), computer (OR = 1.44, 95% CI: 1.03-2.02), video games (OR = 1.65, 95% CI: 1.13-2.69) and consumption of snacks were observed. High use of computer was associated with fried food consumption (OR = 1.32, 95% CI: 1.01-1.75) and physical inactivity (OR = 1.41, 95% CI: 1.03-1.95). Mobile phone use was associated with consumption of sweets (OR = 1.33, 95% CI: 1.00-1.80). Clustered use of screen devices showed associations with high consumption of snacks, fried foods, and sweets, even after controlling for confounding variables. The high use of screen devices was associated with high consumption of snacks, fried foods, and sweets, and with physical inactivity in adolescents.

  19. Construction and Design Kits: Human Problem-Domain Communication

    National Research Council Canada - National Science Library

    Fischer, Gerhard; Lemke, Andreas C

    1987-01-01

    .... To provide the user with the appropriate level of control and a better understanding, we have to replace human-computer communication with human problem-domain communication, which allows users...

  20. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  1. Direct time-domain techniques for transient radiation and scattering

    International Nuclear Information System (INIS)

    Miller, E.K.; Landt, J.A.

    1976-01-01

    A tutorial introduction to transient electromagnetics, focusing on direct time-domain techniques, is presented. Physical, mathematical, numerical, and experimental aspects of time-domain methods are examined, with emphasis on wire objects excited as antennas or scatterers. Numerous computed examples illustrate the characteristics of direct time-domain procedures, especially where they may offer advantages over procedures in the more familiar frequency domain. These advantages include greater solution efficiency for many types of problems, the ability to handle nonlinearities, improved physical insight and interpretability, availability of wide-band information from a single calculation, and the possibility of isolating interactions among various parts of an object using time-range gating.

  2. Climate Resilience Screening Index and Domain Scores

    Data.gov (United States)

    U.S. Environmental Protection Agency — CRSI and related-domain scores for all 50 states and 3135 counties in the U.S. This dataset is not publicly accessible because: They are already available within the...

  3. Astronomy and Computing: A new journal for the astronomical computing community

    NARCIS (Netherlands)

    Accomazzi, A.; Budavári, T.; Fluke, C.; Gray, N.; Mann, R.G.; O'Mullane, W.; Wicenec, A.; Wise, M.

    2013-01-01

    We introduce Astronomy and Computing, a new journal for the growing population of people working in the domain where astronomy overlaps with computer science and information technology. The journal aims to provide a new communication channel within that community, which is not well served by current

  4. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression

    Directory of Open Access Journals (Sweden)

    Xu Yu

    2018-01-01

    Full Text Available Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of the different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into a weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods.
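
    The final step of the abstract, locally weighted linear regression, is standard and easy to sketch. The following is a minimal one-dimensional illustration (not the authors' code; the feature-construction and cross-domain setup are omitted, and the bandwidth `tau` is an invented parameter):

```python
import math

def lwlr_predict(x_query, xs, ys, tau=0.5):
    """Locally weighted linear regression at a single query point.

    Each training point gets a Gaussian weight based on its distance to
    the query; we then solve the 2x2 weighted least-squares system
    (X^T W X) theta = X^T W y with X = [1, x] for an intercept and slope.
    Nonparametric: no single global model is ever fit.
    """
    w = [math.exp(-(x - x_query) ** 2 / (2 * tau ** 2)) for x in xs]
    # Accumulate the entries of the normal equations.
    s_w = sum(w)
    s_wx = sum(wi * xi for wi, xi in zip(w, xs))
    s_wxx = sum(wi * xi * xi for wi, xi in zip(w, xs))
    s_wy = sum(wi * yi for wi, yi in zip(w, ys))
    s_wxy = sum(wi * xi * yi for wi, xi, yi in zip(w, xs, ys))
    det = s_w * s_wxx - s_wx ** 2
    intercept = (s_wxx * s_wy - s_wx * s_wxy) / det
    slope = (s_w * s_wxy - s_wx * s_wy) / det
    return intercept + slope * x_query

# On exactly linear data, LWLR reproduces the line at any query point.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [2 * x + 1 for x in xs]
pred = lwlr_predict(0.5, xs, ys)
```

    Because the fit is redone per query with local weights, the underfitting/overfitting trade-off the abstract mentions is controlled only by the bandwidth, not by a global model class.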

  5. A thermodynamic definition of protein domains.

    Science.gov (United States)

    Porter, Lauren L; Rose, George D

    2012-06-12

    Protein domains are conspicuous structural units in globular proteins, and their identification has been a topic of intense biochemical interest dating back to the earliest crystal structures. Numerous disparate domain identification algorithms have been proposed, all involving some combination of visual intuition and/or structure-based decomposition. Instead, we present a rigorous, thermodynamically-based approach that redefines domains as cooperative chain segments. In greater detail, most small proteins fold with high cooperativity, meaning that the equilibrium population is dominated by completely folded and completely unfolded molecules, with a negligible subpopulation of partially folded intermediates. Here, we redefine structural domains in thermodynamic terms as cooperative folding units, based on m-values, which measure the cooperativity of a protein or its substructures. In our analysis, a domain is equated to a contiguous segment of the folded protein whose m-value is largely unaffected when that segment is excised from its parent structure. Defined in this way, a domain is a self-contained cooperative unit; i.e., its cooperativity depends primarily upon intrasegment interactions, not intersegment interactions. Implementing this concept computationally, the domains in a large representative set of proteins were identified; all exhibit consistency with experimental findings. Specifically, our domain divisions correspond to the experimentally determined equilibrium folding intermediates in a set of nine proteins. The approach was also proofed against a representative set of 71 additional proteins, again with confirmatory results. Our reframed interpretation of a protein domain transforms an indeterminate structural phenomenon into a quantifiable molecular property grounded in solution thermodynamics.

  6. Transcript structure and domain display: a customizable transcript visualization tool.

    Science.gov (United States)

    Watanabe, Kenneth A; Ma, Kaiwang; Homayouni, Arielle; Rushton, Paul J; Shen, Qingxi J

    2016-07-01

    Transcript Structure and Domain Display (TSDD) is a publicly available, web-based program that provides publication-quality images of transcript structures and domains. TSDD is capable of producing transcript structures from GFF/GFF3 and BED files. Alternatively, the GFF files of several model organisms have been pre-loaded so that users only need to enter the locus IDs of the transcripts to be displayed. Visualization of transcripts provides many benefits to researchers, ranging from evolutionary analysis of DNA-binding domains to predictive function modeling. TSDD is freely available for non-commercial users at http://shenlab.sols.unlv.edu/shenlab/software/TSD/transcript_display.html (contact: jeffery.shen@unlv.nevada.edu). © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Domain Adaptation for Pedestrian Detection Based on Prediction Consistency

    Directory of Open Access Journals (Sweden)

    Yu Li-ping

    2014-01-01

    Full Text Available Pedestrian detection is an active area of research in computer vision. It remains a quite challenging problem in many applications where many factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if rich target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from the consistency of predictions with the source classifications, to selectively choose the samples in the source domains that show positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially with insufficient labeled data in the target scene.

  8. Using context to improve protein domain identification

    Directory of Open Access Journals (Sweden)

    Llinás Manuel

    2011-03-01

    Full Text Available Abstract Background Identifying domains in protein sequences is an important step in protein structural and functional annotation. Existing domain recognition methods typically evaluate each domain prediction independently of the rest. However, the majority of proteins are multidomain, and pairwise domain co-occurrences are highly specific and non-transitive. Results Here, we demonstrate how to exploit domain co-occurrence to boost weak domain predictions that appear in previously observed combinations, while penalizing higher confidence domains if such combinations have never been observed. Our framework, Domain Prediction Using Context (dPUC, incorporates pairwise "context" scores between domains, along with traditional domain scores and thresholds, and improves domain prediction across a variety of organisms from bacteria to protozoa and metazoa. Among the genomes we tested, dPUC is most successful at improving predictions for the poorly-annotated malaria parasite Plasmodium falciparum, for which over 38% of the genome is currently unannotated. Our approach enables high-confidence annotations in this organism and the identification of orthologs to many core machinery proteins conserved in all eukaryotes, including those involved in ribosomal assembly and other RNA processing events, which surprisingly had not been previously known. Conclusions Overall, our results demonstrate that this new context-based approach will provide significant improvements in domain and function prediction, especially for poorly understood genomes for which the need for additional annotations is greatest. Source code for the algorithm is available under a GPL open source license at http://compbio.cs.princeton.edu/dpuc/. Pre-computed results for our test organisms and a web server are also available at that location.
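
    The mechanism described above, boosting weak domain hits that appear in previously observed pairings, can be sketched as follows. This is a hypothetical simplification, not the dPUC implementation; the domain names, scores, context bonuses and threshold are all invented for illustration:

```python
def rescore_with_context(candidates, context, threshold):
    """Re-score candidate domain predictions using pairwise context.

    `candidates` is a list of (domain_name, raw_score) pairs for one
    protein; `context[(a, b)]` is a log-odds-style bonus for the pair
    (negative values would penalize never-observed combinations).
    Each candidate's score is adjusted by the sum of its bonuses with
    every other candidate, then the threshold is re-applied.
    """
    kept = []
    for name, score in candidates:
        bonus = sum(
            context.get((name, other), context.get((other, name), 0.0))
            for other, _ in candidates if other != name
        )
        if score + bonus >= threshold:
            kept.append(name)
    return kept

# A weak hit ("PH") crosses the threshold only because it co-occurs
# with a strong hit ("kinase"); in isolation it would be discarded.
candidates = [("kinase", 25.0), ("PH", 8.0)]
context = {("kinase", "PH"): 5.0}
kept = rescore_with_context(candidates, context, threshold=10.0)
```

    The real algorithm additionally has to resolve overlapping candidate placements, which makes the optimization considerably harder than this independent rescoring.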

  9. Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization

    Science.gov (United States)

    Tanaka, Ken; Tomeba, Hiromichi; Adachi, Fumiyuki

    Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of orthogonal frequency division multiplexing (OFDM) and time-domain spreading, while multi-carrier code division multiple access (MC-CDMA) is a combination of OFDM and frequency-domain spreading. In MC-CDMA, a good bit error rate (BER) performance can be achieved by using frequency-domain equalization (FDE), since the frequency diversity gain is obtained. On the other hand, the conventional orthogonal MC DS-CDMA fails to achieve any frequency diversity gain. In this paper, we propose a new orthogonal MC DS-CDMA that can obtain the frequency diversity gain by applying FDE. The conditional BER analysis is presented. The theoretical average BER performance in a frequency-selective Rayleigh fading channel is evaluated by the Monte-Carlo numerical computation method using the derived conditional BER and is confirmed by computer simulation of the orthogonal MC DS-CDMA signal transmission.
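
    The frequency-domain equalization step itself can be illustrated with a toy one-tap zero-forcing equalizer on a single block. This is a generic sketch, not the paper's MMSE-FDE for orthogonal MC DS-CDMA: the channel taps and BPSK block are made up, noise is omitted, and a cyclic prefix is assumed so the channel acts as a circular convolution:

```python
import cmath

def dft(x):
    """Naive DFT; fine for a toy block size."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(x_hat):
    n = len(x_hat)
    return [sum(x_hat[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def circular_convolve(x, h):
    """Channel acting on one block: with a cyclic prefix the linear
    channel looks circular, which is what enables one-tap FDE."""
    n = len(x)
    return [sum(h[m] * x[(t - m) % n] for m in range(len(h))) for t in range(n)]

h = [1.0, 0.5, 0.25]                    # made-up multipath channel taps
s = [1, -1, 1, 1, -1, 1, -1, -1]        # one BPSK block
r = circular_convolve(s, h)             # received block (noise omitted)

H = dft(h + [0.0] * (len(s) - len(h)))  # channel frequency response
S_hat = idft([rk / hk for rk, hk in zip(dft(r), H)])  # one-tap ZF FDE
recovered = [1 if z.real > 0 else -1 for z in S_hat]
```

    In the noiseless case zero-forcing recovers the block exactly; the MMSE weights used in the paper trade some residual interference for noise robustness.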

  10. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, Peter D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bahney, Ben [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matarazzo, Celeste [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Markey, Michael [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pearl, Jonathan [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-03

    Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy,” (CDD Project) led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with the International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  11. Minimum-domain impulse theory for unsteady aerodynamic force

    Science.gov (United States)

    Kang, L. L.; Liu, L. Q.; Su, W. D.; Wu, J. Z.

    2018-01-01

    We extend the impulse theory for unsteady aerodynamics from its classic global form to finite-domain formulation then to minimum-domain form and from incompressible to compressible flows. For incompressible flow, the minimum-domain impulse theory raises the finding of Li and Lu ["Force and power of flapping plates in a fluid," J. Fluid Mech. 712, 598-613 (2012)] to a theorem: The entire force with discrete wake is completely determined by only the time rate of impulse of those vortical structures still connecting to the body, along with the Lamb-vector integral thereof that captures the contribution of all the rest disconnected vortical structures. For compressible flows, we find that the global form in terms of the curl of momentum ∇ × (ρu), obtained by Huang [Unsteady Vortical Aerodynamics (Shanghai Jiaotong University Press, 1994)], can be generalized to having an arbitrary finite domain, but the formula is cumbersome and in general ∇ × (ρu) no longer has discrete structures and hence no minimum-domain theory exists. Nevertheless, as the measure of transverse process only, the unsteady field of vorticity ω or ρω may still have a discrete wake. This leads to a minimum-domain compressible vorticity-moment theory in terms of ρω (but it is beyond the classic concept of impulse). These new findings and applications have been confirmed by our numerical experiments. The results not only open an avenue to combine the theory with computation-experiment in wide applications but also reveal a physical truth that it is no longer necessary to account for all wake vortical structures in computing the force and moment.
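
    For orientation, the classic global form of impulse theory that the abstract generalizes can be stated in standard incompressible notation (a textbook statement, not a formula taken from this paper):

```latex
% Vortical impulse of the flow (n = 2 or 3 is the space dimension)
\mathbf{I} \;=\; \frac{1}{n-1}\int_{V_\infty} \mathbf{x} \times \boldsymbol{\omega}\,\mathrm{d}V ,
\qquad
% Classic global impulse theory: total aerodynamic force on the body
\mathbf{F} \;=\; -\,\rho\,\frac{\mathrm{d}\mathbf{I}}{\mathrm{d}t}.
```

    The minimum-domain result of the abstract replaces the integral over all space $V_\infty$ with one over only the vortical structures still attached to the body, plus a Lamb-vector correction for the detached wake.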

  12. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  13. Coupling between Current and Dynamic Magnetization : from Domain Walls to Spin Waves

    Science.gov (United States)

    Lucassen, M. E.

    2012-05-01

    So far, we have derived some general expressions for domain-wall motion and the spin motive force. We have seen that the β parameter plays a large role in both subjects. In all chapters of this thesis, there is an emphasis on the determination of this parameter. We also know how to incorporate thermal fluctuations for rigid domain walls, as shown above. In Chapter 2, we study a different kind of fluctuations: shot noise. This noise is caused by the fact that an electric current consists of electrons, and therefore has fluctuations. In the process, we also compute transmission and reflection coefficients for a rigid domain wall, and from them the linear momentum transfer. More work on fluctuations is done in Chapter 3. Here, we consider an (extrinsically pinned) rigid domain wall under the influence of thermal fluctuations that induce a current via the spin motive force. We compute how the resulting noise in the current is related to the β parameter. In Chapter 4 we look in more detail into the spin motive forces from field-driven domain walls. Using micromagnetic simulations, we compute the spin motive force due to vortex domain walls explicitly. As mentioned before, this gives qualitatively different results than for a rigid domain wall. The final subject, in Chapter 5, is the application of the general expression for spin motive forces to magnons. Although this might seem to be unrelated to domain-wall motion, this calculation allows us to relate the β parameter to macroscopic transport coefficients. This work was supported by Stichting voor Fundamenteel Onderzoek der Materie (FOM), the Netherlands Organization for Scientific Research (NWO), and by the European Research Council (ERC) under the Seventh Framework Program (FP7).

  14. Alternative to domain wall fermions

    International Nuclear Information System (INIS)

    Neuberger, H.

    2002-01-01

    An alternative to commonly used domain wall fermions is presented. Some rigorous bounds on the condition number of the associated linear problem are derived. On the basis of these bounds and some experimentation it is argued that domain wall fermions will in general be associated with a condition number that is of the same order of magnitude as the product of the condition number of the linear problem in the physical dimensions by the inverse bare quark mass. Thus, the computational cost of implementing true domain wall fermions using a single conjugate gradient algorithm is of the same order of magnitude as that of implementing the overlap Dirac operator directly using two nested conjugate gradient algorithms. At a cost of about a factor of two in operation count it is possible to make the memory usage of direct implementations of the overlap Dirac operator independent of the accuracy of the approximation to the sign function and of the same order as that of standard Wilson fermions

  15. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  16. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    Science.gov (United States)

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to their extensive social influence, public health emergencies have attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for such events and have caused high concern in emergency management, in which a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social networks; existing methods and models are too limited to achieve satisfactory predictions because of the open, changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better predictions of information dissemination in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the A (H1N1) flu emergency.

  17. Efficient computation of clipped Voronoi diagram for mesh generation

    KAUST Repository

    Yan, Dongming

    2013-04-01

    The Voronoi diagram is a fundamental geometric structure widely used in various fields, especially in computer graphics and geometry computing. For a set of points in a compact domain (i.e. a bounded and closed 2D region or a 3D volume), some Voronoi cells of their Voronoi diagram are infinite or partially outside of the domain, but in practice only the parts of the cells inside the domain are needed, as when computing the centroidal Voronoi tessellation. Such a Voronoi diagram confined to a compact domain is called a clipped Voronoi diagram. We present an efficient algorithm to compute the clipped Voronoi diagram for a set of sites with respect to a compact 2D region or a 3D volume. We also apply the proposed method to optimal mesh generation based on the centroidal Voronoi tessellation. Crown Copyright © 2011 Published by Elsevier Ltd. All rights reserved.
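
    The object the abstract defines can be computed brute-force in 2D: each clipped Voronoi cell is the domain polygon cut by the perpendicular-bisector half-plane against every other site. The sketch below is an O(n²) illustration of that definition only, not the paper's efficient algorithm; the box and sites are invented:

```python
def clip_halfplane(poly, a, b, c):
    """Sutherland-Hodgman clip of a convex polygon against a*x + b*y <= c."""
    out = []
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        in1 = a * x1 + b * y1 <= c
        in2 = a * x2 + b * y2 <= c
        if in1:
            out.append((x1, y1))
        if in1 != in2:  # edge crosses the boundary line: keep intersection
            t = (c - a * x1 - b * y1) / (a * (x2 - x1) + b * (y2 - y1))
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return out

def clipped_voronoi(sites, box):
    """Cell i = domain polygon clipped by the bisector against every
    other site, keeping the side closer to site i."""
    cells = []
    for i, (px, py) in enumerate(sites):
        poly = list(box)
        for j, (qx, qy) in enumerate(sites):
            if i == j or not poly:
                continue
            poly = clip_halfplane(poly, qx - px, qy - py,
                                  (qx * qx + qy * qy - px * px - py * py) / 2)
        cells.append(poly)
    return cells

def polygon_area(poly):
    """Shoelace formula."""
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))) / 2

box = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
sites = [(0.25, 0.5), (0.75, 0.5)]
cells = clipped_voronoi(sites, box)  # two half-square cells
```

    A useful sanity check, exploited in the test: the clipped cells partition the domain, so their areas must sum to the domain's area.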

  19. Absorption and scattering properties of arbitrarily shaped particles in the Rayleigh domain

    International Nuclear Information System (INIS)

    Min, M.; Hovenier, J.W.; Dominik, C.; Koter, A. de; Yurkin, M.A.

    2006-01-01

    We provide a theoretical foundation for the statistical approach to computing the absorption properties of particles in the Rayleigh domain. We present a general method based on the discrete dipole approximation to compute the absorption and scattering properties of particles in the Rayleigh domain. The method allows one to separate the geometrical aspects of a particle from its material properties. Performing the computation of the optical properties of a particle once provides them for any set of refractive indices, wavelengths and orientations. This allows fast computation of, e.g., absorption spectra of arbitrarily shaped particles. Other practical applications of the method are in the interpretation of atmospheric and radar measurements, as well as computations of the scattering matrix of small particles as a function of the scattering angle. In the statistical approach, the optical properties of irregularly shaped particles are represented by the average properties of an ensemble of particles with simple shapes. We show that the absorption cross section of an ensemble of arbitrarily shaped particles with arbitrary orientations can always be uniquely represented by the average absorption cross section of an ensemble of spheroidal particles with the same composition and fixed orientation. This proves for the first time that the statistical approach is generally viable in the Rayleigh domain.
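
    For orientation, the simplest Rayleigh-domain absorption computation, the textbook cross-section of a small homogeneous sphere, is one line of complex arithmetic. This is a standard formula shown for context, not the paper's method, which handles arbitrary shapes via the discrete dipole approximation; the radius, wavelength and refractive indices below are invented:

```python
import math

def rayleigh_absorption(radius, wavelength, m):
    """Absorption cross-section of a small sphere (size parameter
    x = 2*pi*a/lambda << 1, same length units throughout):
      alpha = 4*pi*a^3 (m^2 - 1) / (m^2 + 2)   (polarizability)
      C_abs = k * Im(alpha)                     (k = 2*pi/lambda)
    A purely real refractive index gives zero absorption.
    """
    k = 2 * math.pi / wavelength
    alpha = 4 * math.pi * radius ** 3 * (m * m - 1) / (m * m + 2)
    return k * alpha.imag

# Non-absorbing vs. absorbing material at the same size and wavelength.
c_clear = rayleigh_absorption(0.01, 0.55, complex(1.5, 0.0))
c_dirty = rayleigh_absorption(0.01, 0.55, complex(1.5, 0.1))
```

    Note that in this regime the cross-section depends on volume and composition but not on orientation, which is part of what makes the statistical shape approach workable.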

  20. Conduction at domain walls in oxide multiferroics

    Science.gov (United States)

    Seidel, J.; Martin, L. W.; He, Q.; Zhan, Q.; Chu, Y.-H.; Rother, A.; Hawkridge, M. E.; Maksymovych, P.; Yu, P.; Gajek, M.; Balke, N.; Kalinin, S. V.; Gemming, S.; Wang, F.; Catalan, G.; Scott, J. F.; Spaldin, N. A.; Orenstein, J.; Ramesh, R.

    2009-03-01

    Domain walls may play an important role in future electronic devices, given their small size as well as the fact that their location can be controlled. Here, we report the observation of room-temperature electronic conductivity at ferroelectric domain walls in the insulating multiferroic BiFeO3. The origin and nature of the observed conductivity are probed using a combination of conductive atomic force microscopy, high-resolution transmission electron microscopy and first-principles density functional computations. Our analyses indicate that the conductivity correlates with structurally driven changes in both the electrostatic potential and the local electronic structure, which shows a decrease in the bandgap at the domain wall. Additionally, we demonstrate the potential for device applications of such conducting nanoscale features.

  1. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems, from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section, further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation, from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  2. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.
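
    The basic mechanic of domain decomposed particle transport, each domain advancing its own particles and handing off the ones that cross a boundary, can be sketched serially. This is a toy 1D random walk with invented parameters, not the dissertation's parallel algorithms; in a real code each domain's buffer would live on a different processor and the hand-offs would be messages:

```python
import random

def transport(domains=(0.0, 1.0, 2.0), n_particles=100,
              absorb_p=0.2, step=0.3, seed=0):
    """Two spatial domains [lo, mid) and [mid, hi). Each pass, every
    domain advances its buffered particles; a particle is absorbed,
    stays, crosses into the neighboring domain's buffer, or leaks out
    of the global boundary. Loop until all buffers drain."""
    rng = random.Random(seed)
    lo, mid, hi = domains
    buffers = {0: [rng.uniform(lo, mid) for _ in range(n_particles)], 1: []}
    absorbed = leaked = 0
    while buffers[0] or buffers[1]:
        for d in (0, 1):
            incoming, buffers[d] = buffers[d], []
            left, right = (lo, mid) if d == 0 else (mid, hi)
            for x in incoming:
                if rng.random() < absorb_p:
                    absorbed += 1
                    continue
                x += rng.choice((-step, step))
                if left <= x < right:
                    buffers[d].append(x)        # still in this domain
                elif lo <= x < hi:
                    buffers[1 - d].append(x)    # hand off to neighbor
                else:
                    leaked += 1                 # left the global domain
    return absorbed, leaked

absorbed, leaked = transport()
```

    The scalability problem the abstract addresses lives precisely in the hand-off step: done naively across millions of processors, the exchange bookkeeping can dwarf the physics.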

  3. Cross-Sectional Associations between Home Environmental Factors and Domain-Specific Sedentary Behaviors in Adults: The Moderating Role of Socio-Demographic Variables and BMI.

    Science.gov (United States)

    Compernolle, Sofie; Busschaert, Cedric; De Bourdeaudhuij, Ilse; Cardon, Greet; Chastin, Sebastien F M; Van Cauwenberg, Jelle; De Cocker, Katrien

    2017-10-31

    Despite the negative health effects of too much sitting, the majority of adults are too sedentary. To develop effective interventions, insight is needed into the home environmental correlates of adults' sedentary behaviors, and into the susceptibility of population subgroups to these home environmental cues. In total, 559 Flemish adults reported socio-demographics, weight and height, home environmental factors and domain-specific sedentary behaviors. Generalized linear modeling was conducted to examine main associations between home environmental factors and domain-specific sedentary behaviors, and to test the moderating role of socio-demographics and BMI on these associations. In case of significant interactions, stratified analyses were performed. Results showed that, among those who did use a computer/laptop during the last week, a one-unit increase in the number of computers or laptops was associated with 17% (OR = 1.17; 95% CI = 1.02, 1.34) and 24% (OR = 1.24; 95% CI = 1.08, 1.43) more minutes of computer time per day, respectively. The proximity of the remote controller ( p vehicles (95% CI = 0.001, 0.12) was positively associated with the odds of participation in transport-related sitting time. The latter two associations were moderated by BMI, with significant positive associations limited to those not overweight. To conclude, home environmental factors were associated with domain-specific sedentary behaviors, especially in healthy-weight adults. If confirmed by longitudinal studies, public health professionals should encourage adults to limit the number of indoor entertainment devices and motorized vehicles.
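
    Odds ratios like those reported above come from model coefficients or, in the simplest two-by-two case, directly from counts. The sketch below shows the generic computation with the standard Woolf confidence interval; the counts are made up for illustration and are not this study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:

                exposed  unexposed
      case         a         b
      control      c         d

    OR = (a*d)/(b*c); the CI uses the Woolf (log) method:
    exp(ln OR +/- z * sqrt(1/a + 1/b + 1/c + 1/d)).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)  # OR = (10*40)/(20*5) = 4.0
```

    An interval that excludes 1.0, as in several of the associations reported above, is what marks the association as statistically significant at the 5% level.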

  4. Generalized predictive control in the delta-domain

    DEFF Research Database (Denmark)

    Lauritsen, Morten Bach; Jensen, Morten Rostgaard; Poulsen, Niels Kjølstad

    1995-01-01

    This paper describes new approaches to generalized predictive control formulated in the delta (δ) domain. A new δ-domain version of the continuous-time emulator-based predictor is presented. It produces the optimal estimate in the deterministic case whenever the predictor order is chosen greater...... than or equal to the number of future predicted samples, however a “good” estimate is usually obtained in a much longer range of samples. This is particularly advantageous at fast sampling rates where a “conventional” predictor is bound to become very computationally demanding. Two controllers...

  5. Psychometric characteristics of a public-domain self-report measure of vocational interests: the Oregon Vocational Interest Scales.

    Science.gov (United States)

    Pozzebon, Julie A; Visser, Beth A; Ashton, Michael C; Lee, Kibeom; Goldberg, Lewis R

    2010-03-01

    We investigated the psychometric properties of the Oregon Vocational Interest Scales (ORVIS), a brief public-domain alternative to commercial inventories, in a large community sample and in a college sample. In both samples, we examined the factor structure, scale intercorrelations, and personality correlates of the ORVIS, and in the community sample, we also examined the correlations of the ORVIS scales with cognitive abilities and with the scales of a longer, proprietary interest survey. In both samples, all 8 scales (Leadership, Organization, Altruism, Creativity, Analysis, Producing, Adventuring, and Erudition) showed wide variation in scores, high internal-consistency reliabilities, and a pattern of high convergent and low discriminant correlations with the scales of the proprietary interest survey. Overall, the results support the construct validity of the scales, which are recommended for use in research on vocational interests and other individual differences.

  6. La apropiación del dominio público y las posibilidades de acceso a los bienes culturales | The appropriation of the public domain and the possibilities of access to cultural goods

    Directory of Open Access Journals (Sweden)

    Joan Ramos Toledano

    2017-06-01

    Full Text Available The legislation of continental intellectual property and copyright provides for a period of protection granting exclusive and temporary economic rights. After a certain period, protected works enter what is called the public domain. This is often considered the moment at which cultural goods come under the control and domain of society as a whole. The present paper argues that, given our current economic system, the public domain actually functions more as a business opportunity for certain companies than as a real option for the public to access artistic and intellectual works.

  7. Time-Domain Simulation of RF Couplers

    International Nuclear Information System (INIS)

    Smithe, David; Carlsson, Johan; Austin, Travis

    2009-01-01

    We have developed a finite-difference time-domain (FDTD) fluid-like approach to integrated plasma-and-coupler simulation [1], and show how it can be used to model LH and ICRF couplers in MST and larger tokamaks [2]. This approach permits a very accurate 3-D representation of coupler geometry, and easily includes non-axisymmetry in the vessel wall, magnetic equilibrium, and plasma density. The plasma is integrated with the FDTD Maxwell solver in an implicit solve that steps over electron time-scales, and permits tenuous plasma in the coupler itself, without any need to distinguish or interface between different regions of vacuum and/or plasma. The FDTD algorithm is also generalized to incorporate a time-domain sheath potential [3] on metal structures within the simulation, to look for situations where the sheath potential might generate local sputtering opportunities. Benchmarking of the time-domain sheath algorithm has been reported in the references. Finally, the time-domain software [4] permits the use of particles, either as a field diagnostic (test particles) or to self-consistently compute the plasma current from the applied RF power.
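    The FDTD core of such a solver is a leapfrog update of E and H fields on a staggered (Yee) grid. A minimal vacuum 1-D sketch in normalized units, without the plasma fluid terms or sheath model that are this paper's actual contributions:

```python
import numpy as np

def fdtd_1d(nx=200, nt=300):
    """Leapfrog-update Ez/Hy on a staggered 1-D grid (normalized units,
    Courant number 0.5) with a soft Gaussian source at the midpoint."""
    ez = np.zeros(nx)
    hy = np.zeros(nx - 1)
    c = 0.5  # Courant number; <= 1 for stability in 1-D
    for n in range(nt):
        hy += c * np.diff(ez)            # H update from curl of E
        ez[1:-1] += c * np.diff(hy)      # E update from curl of H
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft source
    return ez

fields = fdtd_1d()
print(np.isfinite(fields).all(), np.abs(fields).max() > 0)
```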

  8. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routines, and efficient use of memory are also elaborated. This publication is intended ...

  9. Selected ICAR Data from the SAPA-Project: Development and Initial Validation of a Public-Domain Measure

    Directory of Open Access Journals (Sweden)

    David M. Condon

    2016-01-01

    Full Text Available These data were collected during the initial evaluation of the International Cognitive Ability Resource (ICAR) project. ICAR is an international collaborative effort to develop open-source public-domain tools for cognitive ability assessment, including tools that can be administered in non-proctored environments (e.g., online administration) and tools based on automatic item generation algorithms. These data provide initial validation of the first four ICAR item types as reported in Condon & Revelle [1]. The 4 item types contain a total of 60 items: 9 Letter and Number Series items, 11 Matrix Reasoning items, 16 Verbal Reasoning items and 24 Three-Dimensional Rotation items. Approximately 97,000 individuals were administered random subsets of these 60 items using the Synthetic Aperture Personality Assessment method between August 18, 2010 and May 20, 2013. The data are available in RData and CSV formats and are accompanied by documentation stored as a text file. Re-use potential includes a wide range of structural and item-level analyses.

  10. ICRP Publication 116—the first ICRP/ICRU application of the male and female adult reference computational phantoms

    CERN Document Server

    Petoussi-Henss, Nina; Eckerman, Keith F; Endo, Akira; Hertel, Nolan; Hunt, John; Menzel, Hans G; Pelliccioni, Maurizio; Schlattl, Helmut; Zankl, Maria

    2014-01-01

    ICRP Publication 116, 'Conversion coefficients for radiological protection quantities for external radiation exposures', provides fluence-to-dose conversion coefficients for organ-absorbed doses and effective dose for various types of external exposures (ICRP 2010 ICRP Publication 116). The publication supersedes ICRP Publication 74 (ICRP 1996 ICRP Publication 74; ICRU 1998 ICRU Report 57), including new particle types and expanding the energy ranges considered. The coefficients were calculated using the ICRP/ICRU computational phantoms (ICRP 2009 ICRP Publication 110) representing the reference adult male and reference adult female (ICRP 2002 ICRP Publication 89), together with a variety of Monte Carlo codes simulating the radiation transport in the body. Idealized whole-body irradiation from unidirectional and rotational parallel beams as well as isotropic irradiation was considered for a large variety of incident radiations and energy ranges. Comparison of the effective doses with operational quantit...

  11. Ubiquitin domain proteins in disease

    DEFF Research Database (Denmark)

    Klausen, Louise Kjær; Schulze, Andrea; Seeger, Michael

    2007-01-01

    The human genome encodes several ubiquitin-like (UBL) domain proteins (UDPs). Members of this protein family are involved in a variety of cellular functions and many are connected to the ubiquitin proteasome system, an essential pathway for protein degradation in eukaryotic cells. Several UDPs have been implicated in human disease, including cancer. Publication history: Republished from Current BioData's Targeted Proteins database (TPdb; http://www.targetedproteinsdb.com).

  12. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little-known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  13. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. The book extends the existing literature to the latest developments in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. Topics include advanced techniques in MoM, FEM and FDTD, the spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency- and time-domain integral equations, and statistical methods in bioelectromagnetics.

  14. Computational Study of Correlated Domain Motions in the AcrB Efflux Transporter

    Directory of Open Access Journals (Sweden)

    Robert Schulz

    2015-01-01

    Full Text Available As an active part of the major efflux system in E. coli bacteria, AcrB is responsible for the uptake and pumping of toxic substrates from the periplasm toward the extracellular space. In combination with the channel protein TolC and the membrane fusion protein AcrA, this efflux pump helps the bacterium survive different kinds of noxious compounds. With the present study we intend to enhance the understanding of the interactions between the domains and monomers, for example, the transduction of mechanical energy from the transmembrane domain into the porter domain, correlated motions of different subdomains within monomers, and cooperative effects between monomers. To this end, targeted molecular dynamics simulations have been employed, either steering the whole protein complex or specific parts thereof. By forcing only parts of the complex towards specific conformational states, the risk of transient artificial conformations during the simulations is reduced. Distinct cooperative effects between the monomers in AcrB have been observed. Possible allosteric couplings have been identified, providing microscopic insights that might be exploited to design more efficient inhibitors of efflux systems.

  15. A method for the automated, reliable retrieval of publication-citation records.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

    Full Text Available BACKGROUND: Publication records and citation indices are often used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to incomplete knowledge of an individual's publication list and/or the lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate estimates of publication-citation records and citation indices. METHODOLOGY: In this paper, we present a new, automated method that produces estimates of an individual's publication-citation record from an individual's name and a set of domain-specific vocabulary that may occur in the individual's publication titles. Because this vocabulary can be harvested directly from a research web page or an online (partial) publication list, our method delivers an easy way to obtain estimates of a publication-citation record and the relevant citation indices. Our method works by applying a series of stringent name and content filters to the raw publication search results returned by an online publication search engine. In this paper, our method is run using Google Scholar, but the underlying filters can be easily applied to any existing publication search engine. When compared against a manually constructed data set of individuals and their publication-citation records, our method provides significant improvements over raw search results. The estimated publication-citation records returned by our method have an average sensitivity of 98% and a specificity of 72% (in contrast to a raw search result specificity of less than 10%). When citation indices are computed using these records, the estimated indices are within 10% of the true value, compared to raw search results, which overestimate by, on average, 75%. CONCLUSIONS: These results confirm that our method provides ...
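    The core of the method — name and vocabulary filtering of raw hits, then computing a citation index from the survivors — can be sketched as follows. The hit format and filter rules here are illustrative, not the paper's exact criteria:

```python
def filter_hits(hits, author, vocabulary):
    """Keep hits whose author list contains `author` and whose title
    shares at least one term with the domain-specific vocabulary."""
    vocab = {w.lower() for w in vocabulary}
    return [h for h in hits
            if author.lower() in (a.lower() for a in h["authors"])
            and vocab & {w.lower() for w in h["title"].split()}]

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cs = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cs, start=1) if c >= i)

# Hypothetical raw search results (titles and counts are made up):
hits = [
    {"authors": ["D. Ruths"], "title": "Signaling network inference", "cites": 10},
    {"authors": ["D. Ruths"], "title": "Gardening tips", "cites": 50},  # no vocab match
    {"authors": ["A. Other"], "title": "Network models", "cites": 8},   # wrong author
    {"authors": ["D. Ruths"], "title": "Pathway network analysis", "cites": 4},
]
mine = filter_hits(hits, "D. Ruths", ["network", "pathway", "signaling"])
print([h["title"] for h in mine])
print(h_index([h["cites"] for h in mine]))  # h-index of [10, 4] -> 2
```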

  16. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  17. Multilevel domain decomposition for electronic structure calculations

    International Nuclear Information System (INIS)

    Barrault, M.; Cances, E.; Hager, W.W.; Le Bris, C.

    2007-01-01

    We introduce a new multilevel domain decomposition method (MDD) for electronic structure calculations within semi-empirical and density functional theory (DFT) frameworks. This method iterates between local fine solvers and global coarse solvers, in the spirit of domain decomposition methods. Using this approach, calculations have been successfully performed on several linear polymer chains containing up to 40,000 atoms and 200,000 atomic orbitals. Both the computational cost and the memory requirement scale linearly with the number of atoms. Additional speed-up can easily be obtained by parallelization. We show that this domain decomposition method outperforms the density matrix minimization (DMM) method for poor initial guesses. Our method provides an efficient preconditioner for DMM and other linear scaling methods, variational in nature, such as the orbital minimization (OM) procedure

  18. Time Domain Partitioning of Electricity Production Cost Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Barrows, C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hummon, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jones, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-01-01

    Production cost models are often used for planning by simulating power system operations over long time horizons. The simulation of a day-ahead energy market can take several weeks to compute. Tractability improvements are often made through model simplifications, such as reductions in transmission modeling detail, relaxation of commitment variable integrality, and reductions in cost modeling detail. One common simplification is to partition the simulation horizon so that weekly or monthly horizons can be simulated in parallel. However, horizon partitions are often executed with overlap periods of arbitrary, and sometimes zero, length. We calculate the time-domain persistence of historical unit commitment decisions to inform the time-domain partitioning of production cost models. The results are implemented using the PLEXOS production cost modeling software in an HPC environment to improve the computation time of simulations while maintaining solution integrity.
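    The partitioning itself is simple to express; the paper's contribution is choosing the overlap length from unit-commitment persistence rather than arbitrarily. A sketch of overlapped, day-indexed windows (window and overlap lengths are illustrative):

```python
def partition_horizon(n_days, window, overlap):
    """Split [0, n_days) into windows of `window` days, each preceded by
    `overlap` warm-up days whose results are discarded after the solve.
    Returns (warm-up start, keep-from, keep-to) tuples."""
    parts = []
    start = 0
    while start < n_days:
        end = min(start + window, n_days)
        parts.append((max(0, start - overlap), start, end))
        start = end
    return parts

# A year of daily simulation in monthly-ish chunks with 2 warm-up days:
for warm, keep_from, keep_to in partition_horizon(365, 30, 2):
    print(warm, keep_from, keep_to)
```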

  19. Bioinformatics process management: information flow via a computational journal

    Directory of Open Access Journals (Sweden)

    Lushington Gerald

    2007-12-01

    Full Text Available Abstract This paper presents the Bioinformatics Computational Journal (BCJ), a framework for conducting and managing computational experiments in bioinformatics and computational biology. These experiments often involve series of computations, data searches, filters, and annotations which can benefit from a structured environment. Systems to manage computational experiments exist, ranging from libraries with standard data models to elaborate schemes for chaining together input and output between applications. Yet, although such frameworks are available, their use is not widespread; ad hoc scripts are often required to bind applications together. The BCJ explores another solution to this problem through a computer-based environment suitable for on-site use, which builds on the traditional laboratory notebook paradigm. It provides an intuitive, extensible paradigm designed for expressive composition of applications. Extensive features facilitate sharing data, computational methods, and entire experiments. By focusing on the bioinformatics and computational biology domain, the scope of the computational framework was narrowed, permitting us to implement a capable set of features for this domain. This report discusses the features deemed critical by our system and other projects, along with design issues. We illustrate the use of our implementation of the BCJ on two domain-specific examples.

  20. Modes of Interaction of Pleckstrin Homology Domains with Membranes: Toward a Computational Biochemistry of Membrane Recognition.

    Science.gov (United States)

    Naughton, Fiona B; Kalli, Antreas C; Sansom, Mark S P

    2018-02-02

    Pleckstrin homology (PH) domains mediate protein-membrane interactions by binding to phosphatidylinositol phosphate (PIP) molecules. The structural and energetic basis of selective PH-PIP interactions is central to understanding many cellular processes, yet the molecular complexities of the PH-PIP interactions are largely unknown. Molecular dynamics simulations using a coarse-grained model enable estimation of free-energy landscapes for the interactions of 12 different PH domains with membranes containing PIP2 or PIP3, allowing us to obtain a detailed molecular energetic understanding of the complexities of the interactions of the PH domains with PIP molecules in membranes. Distinct binding modes, corresponding to different distributions of cationic residues on the PH domain, were observed, involving PIP interactions at either the "canonical" (C) and/or "alternate" (A) sites. PH domains can be grouped by the relative strength of their C- and A-site interactions, revealing that a higher affinity correlates with increased C-site interactions. These simulations demonstrate that simultaneous binding of multiple PIP molecules by PH domains contributes to high-affinity membrane interactions, informing our understanding of membrane recognition by PH domains in vivo. Copyright © 2017. Published by Elsevier Ltd.

  1. Simulation of power fluctuation of wind farms based on frequency domain

    DEFF Research Database (Denmark)

    Lin, Jin; Sun, Yuanzhang; Li, Guojie

    2011-01-01

    The wind power fluctuation model built in the frequency domain is mathematically equivalent to that in the time domain, and has a clearer physical meaning, therefore describing the fluctuation more accurately. However, simulation of this model requires dealing with the time-frequency transformation related to the power spectral density (PSD), which is more specialized and complicated than normal transformations. Meanwhile, the computational complexity also increases significantly, and more computation resources are needed. These problems negatively affect the engineering application of the model. To overcome these disadvantages, the physical meaning of the PSD is presented based on fundamental concepts, so that the specialties of this model compared with conventional ones can be understood. Then the time-frequency transformation algorithm is derived, which is fast to implement on digital computers ...
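    The time-frequency transformation in question amounts to drawing a time series whose spectrum follows a target PSD. A common random-phase inverse-FFT sketch; the 1/f PSD shape below is an arbitrary placeholder, not a wind-power spectrum:

```python
import numpy as np

def series_from_psd(psd, rng):
    """Generate a real time series whose power spectrum follows `psd`
    (one-sided, length n//2 + 1) using random phases and an inverse rFFT."""
    amplitudes = np.sqrt(psd)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=psd.size)
    spectrum = amplitudes * np.exp(1j * phases)
    spectrum[0] = 0.0  # zero the DC bin -> zero-mean series
    return np.fft.irfft(spectrum)

rng = np.random.default_rng(0)
freqs = np.arange(1, 130, dtype=float)
psd = np.concatenate(([0.0], 1.0 / freqs))  # placeholder 1/f shape
x = series_from_psd(psd, rng)
print(x.shape, np.isrealobj(x))
```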

  2. Domain decomposition methods for core calculations using the MINOS solver

    International Nuclear Information System (INIS)

    Guerin, P.; Baudron, A. M.; Lautard, J. J.

    2007-01-01

    Cell-by-cell homogenized transport calculations of an entire nuclear reactor core are currently too expensive for industrial applications, even if a simplified transport (SPn) approximation is used. In order to take advantage of parallel computers, we propose here two domain decomposition methods using the mixed dual finite element solver MINOS. The first is a modal synthesis method on overlapping sub-domains: several eigenmode solutions of a local problem on each sub-domain are taken as basis functions for the resolution of the global problem on the whole domain. The second is an iterative method based on non-overlapping domain decomposition with Robin interface conditions. At each iteration, we solve the problem on each sub-domain with the interface conditions given by the solutions on the neighboring sub-domains estimated at the previous iteration. For these two methods, we give numerical results which demonstrate their accuracy and their efficiency for the diffusion model on realistic 2D and 3D cores. (authors)
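    The flavor of the second, iterative approach can be illustrated with the classical alternating Schwarz method on a 1-D Poisson problem — overlapping rather than Robin-coupled, and far simpler than the MINOS SPn setting, but it shows sub-domain solves exchanging interface values until the global solution is reached:

```python
import numpy as np

def solve_poisson(f, left, right):
    """Direct solve of -u'' = f (pre-scaled by h^2) on an interior grid
    with Dirichlet boundary values `left` and `right`."""
    n = f.size
    a = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    rhs = f.copy()
    rhs[0] += left
    rhs[-1] += right
    return np.linalg.solve(a, rhs)

n = 49                    # interior points, h = 1/(n+1)
h = 1.0 / (n + 1)
f = np.full(n, h * h)     # -u'' = 1  ->  exact u = x(1-x)/2
mid, ov = n // 2, 5       # split point and overlap width
u = np.zeros(n)
for _ in range(30):       # alternating Schwarz sweeps
    # left sub-domain: right boundary value taken from current iterate
    u[:mid + ov] = solve_poisson(f[:mid + ov], 0.0, u[mid + ov])
    # right sub-domain: left boundary value taken from updated iterate
    u[mid - ov:] = solve_poisson(f[mid - ov:], u[mid - ov - 1], 0.0)

x = np.linspace(h, 1.0 - h, n)
print(np.abs(u - x * (1.0 - x) / 2.0).max())  # converges to ~0
```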

  3. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    Full Text Available Due to its extensive social influence, public health emergency has attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for such events and have raised high concern in emergency management; a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social networks, and existing methods and models are limited in achieving a satisfactory prediction result due to open, changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the influenza A (H1N1) emergency.
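    As a point of contrast, the baseline diffusion models that such simulation frameworks improve on can be as simple as an independent-cascade process on a graph. An illustrative sketch (not the paper's ACP model; the toy graph is made up):

```python
import random

def independent_cascade(graph, seeds, p, rng):
    """One independent-cascade run: each newly activated node gets a
    single chance to activate each neighbor with probability p."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return active

graph = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5]}
rng = random.Random(42)
spread = independent_cascade(graph, seeds=[0], p=0.5, rng=rng)
print(sorted(spread))  # nodes reached by the cascade, always including the seed
```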

  4. Semiotics, Information Science, Documents and Computers.

    Science.gov (United States)

    Warner, Julian

    1990-01-01

    Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)

  5. Spatiotemporal Data Mining: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shashi Shekhar

    2015-10-01

    Full Text Available Explosive growth in geospatial and temporal data as well as the emergence of new technologies emphasize the need for automated discovery of spatiotemporal knowledge. Spatiotemporal data mining studies the process of discovering interesting and previously unknown, but potentially useful patterns from large spatiotemporal databases. It has broad application domains including ecology and environmental management, public safety, transportation, earth science, epidemiology, and climatology. The complexity of spatiotemporal data and intrinsic relationships limits the usefulness of conventional data science techniques for extracting spatiotemporal patterns. In this survey, we review recent computational techniques and tools in spatiotemporal data mining, focusing on several major pattern families: spatiotemporal outlier, spatiotemporal coupling and tele-coupling, spatiotemporal prediction, spatiotemporal partitioning and summarization, spatiotemporal hotspots, and change detection. Compared with other surveys in the literature, this paper emphasizes the statistical foundations of spatiotemporal data mining and provides comprehensive coverage of computational approaches for various pattern families. We also list popular software tools for spatiotemporal data analysis. The survey concludes with a look at future research needs.

  6. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on sharing experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes, and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and make it easier for new members of the community to find the most promising collaboration partners. This talk presents the methodology and initial findings of the survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium: a collaboratively maintained shared resource of the environmental computing community.

  7. Domain-Specific and Domain-General Training to Improve Kindergarten Children’s Mathematics

    Directory of Open Access Journals (Sweden)

    Geetha B. Ramani

    2017-12-01

    Full Text Available Ensuring that kindergarten children have a solid foundation in early numerical knowledge is of critical importance for later mathematical achievement. In this study, we targeted improving the numerical knowledge of kindergarteners (n = 81 from primarily low-income backgrounds using two approaches: one targeting their conceptual knowledge, specifically, their understanding of numerical magnitudes; and the other targeting their underlying cognitive system, specifically, their working memory. Both interventions involved playing game-like activities on tablet computers over the course of several sessions. As predicted, both interventions improved children’s numerical magnitude knowledge as compared to a no-contact control group, suggesting that both domain-specific and domain-general interventions facilitate mathematical learning. Individual differences in effort during the working memory game, but not the number knowledge training game predicted children’s improvements in number line estimation. The results demonstrate the potential of using a rapidly growing technology in early childhood classrooms to promote young children’s numerical knowledge.

  8. Effective Domain Partitioning for Multi-Clock Domain IP Core Wrapper Design under Power Constraints

    Science.gov (United States)

    Yu, Thomas Edison; Yoneda, Tomokazu; Zhao, Danella; Fujiwara, Hideo

    The rapid advancement of VLSI technology has made it possible for chip designers and manufacturers to embed the components of a whole system onto a single chip, called a System-on-Chip or SoC. SoCs make use of pre-designed modules, called IP cores, which provide faster design time and quicker time-to-market. Furthermore, SoCs that operate at multiple clock domains and under very low power requirements are being utilized in the latest communications, networking and signal processing devices. As a result, the testing of SoCs and multi-clock-domain embedded cores under power constraints has been rapidly gaining importance. In this research, a novel method for designing power-aware test wrappers for embedded cores with multiple clock domains is presented. By effectively partitioning the various clock domains, we are able to increase the solution space of possible test schedules for the core. Since previous methods were limited to concurrently testing all the clock domains, we effectively remove this limitation by making use of bandwidth conversion, multiple shift frequencies and properly gating the clock signals to control the shift activity of various core logic elements. The combination of the above techniques gives us greater flexibility when determining an optimal test schedule under very tight power constraints. Furthermore, since it is computationally intensive to search the entire expanded solution space for possible test schedules, we propose a heuristic 3-D bin packing algorithm to determine the optimal wrapper architecture and test schedule while minimizing the test time under power and bandwidth constraints.
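    The scheduling heuristic belongs to the bin-packing family. A plain first-fit-decreasing sketch under a single power cap conveys the idea; the paper's actual heuristic is 3-D, also packing time and bandwidth, and the power figures below are illustrative:

```python
def first_fit_decreasing(powers, cap):
    """Group core tests into concurrent sessions so that the summed
    test power of each session stays within `cap` (first-fit-decreasing)."""
    sessions = []
    for p in sorted(powers, reverse=True):
        if p > cap:
            raise ValueError(f"test power {p} exceeds the power cap {cap}")
        for s in sessions:
            if sum(s) + p <= cap:
                s.append(p)
                break
        else:
            sessions.append([p])
    return sessions

# Illustrative per-test power figures and a cap of 10 units:
sessions = first_fit_decreasing([7, 5, 4, 4, 3, 2, 1], cap=10)
print(sessions)  # every session's total stays within the cap
```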

  9. Time-domain Green's Function Method for three-dimensional nonlinear subsonic flows

    Science.gov (United States)

    Tseng, K.; Morino, L.

    1978-01-01

    The Green's Function Method for linearized 3D unsteady potential flow (embedded in the computer code SOUSSA P) is extended to include the time-domain analysis as well as the nonlinear term retained in the transonic small disturbance equation. The differential-delay equations in time, as obtained by applying the Green's Function Method (in a generalized sense) and the finite-element technique to the transonic equation, are solved directly in the time domain. Comparisons are made with both linearized frequency-domain calculations and existing nonlinear results.

  10. A Joint Method of Envelope Inversion Combined with Hybrid-domain Full Waveform Inversion

    Science.gov (United States)

    CUI, C.; Hou, W.

    2017-12-01

    Full waveform inversion (FWI) aims to construct high-precision subsurface models by fully using the information in seismic records, including amplitude, travel time, phase and so on. However, high non-linearity and the absence of low-frequency information in seismic data lead to the well-known cycle-skipping problem and make the inversion prone to falling into local minima. In addition, 3D inversion methods based on the acoustic approximation ignore the elastic effects present in real seismic fields, making inversion harder still. As a result, the accuracy of the final inversion results relies heavily on the quality of the initial model. In order to improve the stability and quality of inversion results, multi-scale inversion, which reconstructs the subsurface model from low to high frequencies, is applied. But very low frequencies are typically absent from field data, so we propose a joint method that combines envelope inversion with hybrid-domain FWI, performing forward modeling in the time domain and inversion in the frequency domain. To accelerate the inversion, we adopt CPU/GPU heterogeneous computing techniques with two levels of parallelism. At the first level, the inversion tasks are decomposed and assigned to each computation node by shot number. At the second level, GPU multithreaded programming is used for the computation tasks in each node, including forward modeling, envelope extraction, DFT (discrete Fourier transform) calculation and gradient calculation. Numerical tests demonstrated that the combined envelope inversion + hybrid-domain FWI obtains a much more faithful and accurate result than conventional hybrid-domain FWI, and that CPU/GPU heterogeneous parallel computation substantially improves computation speed.
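    Envelope extraction, the step that recovers low-frequency information for the inversion, is typically done through the analytic signal. A numpy-only sketch using an FFT-based Hilbert transform (not the authors' GPU implementation; the amplitude-modulated test signal is illustrative):

```python
import numpy as np

def envelope(x):
    """Envelope |x + i*H(x)| via the FFT-based analytic signal (even-length x)."""
    n = x.size
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0  # keep DC and Nyquist bins unchanged
    h[1:n // 2] = 2.0       # double the positive frequencies
    return np.abs(np.fft.ifft(spec * h))

# Amplitude-modulated carrier: the envelope should recover the modulation.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
carrier = np.cos(2 * np.pi * 50 * t)
modulation = 1.0 + 0.5 * np.cos(2 * np.pi * 2 * t)
env = envelope(modulation * carrier)
print(np.abs(env - modulation).max())  # ~0 for this band-limited example
```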

  11. Characteristics of scientific web publications

    DEFF Research Database (Denmark)

    Thorlund Jepsen, Erik; Seiden, Piet; Ingwersen, Peter Emil Rerup

    2004-01-01

    were generated based on specifically selected domain topics that are searched for in three publicly accessible search engines (Google, AllTheWeb, and AltaVista). A sample of the retrieved hits was analyzed with regard to how various publication attributes correlated with the scientific quality...... of the content and whether this information could be employed to harvest, filter, and rank Web publications. The attributes analyzed were inlinks, outlinks, bibliographic references, file format, language, search engine overlap, structural position (according to site structure), and the occurrence of various...... types of metadata. As could be expected, the ranked output differs between the three search engines. Apparently, this is caused by differences in ranking algorithms rather than the databases themselves. In fact, because scientific Web content in this subject domain receives few inlinks, both Alta...

  12. Simulation of two-phase flows by domain decomposition

    International Nuclear Information System (INIS)

    Dao, T.H.

    2013-01-01

    This thesis deals with numerical simulations of compressible fluid flows by implicit finite volume methods. Firstly, we studied and implemented an implicit version of the Roe scheme for compressible single-phase and two-phase flows. Thanks to the Newton method for solving nonlinear systems, our schemes are conservative. Unfortunately, the resolution of nonlinear systems is very expensive, so it is essential to use an efficient algorithm to solve them. For large matrices, we often use iterative methods, whose convergence depends on the spectrum. We studied the spectrum of the linear system and proposed a strategy, called Scaling, to improve the condition number of the matrix. Combined with the classical ILU preconditioner, our strategy significantly reduced the number of GMRES iterations for local systems and the computation time. We also show some satisfactory results for low Mach-number flows using the implicit centered scheme. We then studied and implemented a domain decomposition method for compressible fluid flows. We proposed a new interface variable which makes the Schur complement method easy to build and allows us to treat diffusion terms. Using the GMRES iterative solver rather than Richardson iterations for the interface system also provides better performance than other methods. We can also decompose the computational domain into any number of sub-domains. Moreover, the Scaling strategy for the interface system improved the condition number of the matrix and reduced the number of GMRES iterations. In comparison with classical distributed computing, we showed that our method is more robust and efficient. (author) [fr
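    The effect of such a Scaling strategy can be illustrated on a toy system: symmetric diagonal scaling of an ill-scaled matrix sharply lowers its condition number, which is what shortens Krylov (e.g. GMRES) iterations. The matrix below is an arbitrary example, not the Roe-scheme Jacobian from the thesis:

    ```python
    import numpy as np

    # An SPD matrix with badly scaled rows/columns (illustrative only).
    A = np.array([[1e6, 2e2, 0.0],
                  [2e2, 1.0, 3e-3],
                  [0.0, 3e-3, 1e-4]])

    # Symmetric diagonal scaling: As = D^-1/2 A D^-1/2 with D = diag(A).
    d = np.sqrt(np.diag(A))
    As = A / np.outer(d, d)

    print(f"cond(A)  = {np.linalg.cond(A):.1e}")
    print(f"cond(As) = {np.linalg.cond(As):.1e}")
    # The scaled system has a far smaller condition number, so Krylov
    # methods such as GMRES need fewer iterations to converge on it.
    ```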

  13. Domain decomposition methods for the mixed dual formulation of the critical neutron diffusion problem; Methodes de decomposition de domaine pour la formulation mixte duale du probleme critique de la diffusion des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Guerin, P

    2007-12-15

    The neutronic simulation of a nuclear reactor core is performed using the neutron transport equation, and leads to an eigenvalue problem in the steady-state case. Among the deterministic resolution methods, diffusion approximation is often used. For this problem, the MINOS solver based on a mixed dual finite element method has shown its efficiency. In order to take advantage of parallel computers, and to reduce the computing time and the local memory requirement, we propose in this dissertation two domain decomposition methods for the resolution of the mixed dual form of the eigenvalue neutron diffusion problem. The first approach is a component mode synthesis method on overlapping sub-domains. Several eigenmode solutions of a local problem solved by MINOS on each sub-domain are taken as basis functions for the resolution of the global problem on the whole domain. The second approach is a modified iterative Schwarz algorithm based on non-overlapping domain decomposition with Robin interface conditions. At each iteration, the problem is solved on each sub-domain by MINOS with the interface conditions deduced from the solutions on the adjacent sub-domains at the previous iteration. The iterations allow the simultaneous convergence of the domain decomposition and the eigenvalue problem. We demonstrate the accuracy and the parallel efficiency of these two methods with numerical results for the diffusion model on realistic 2- and 3-dimensional cores. (author)

  14. Architecture for time or transform domain decoding of reed-solomon codes

    Science.gov (United States)

    Shao, Howard M. (Inventor); Truong, Trieu-Kie (Inventor); Hsu, In-Shek (Inventor); Deutsch, Leslie J. (Inventor)

    1989-01-01

    Two pipeline (255,233) RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time domain decoder and transform domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time domain decoder uses a Chien search and polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm in sequence for the final decoding steps prior to adding the received RS coded message to produce a decoded output message.

  15. Response to a widespread, unauthorized dispersal of radioactive waste in the public domain

    International Nuclear Information System (INIS)

    Wenslawski, F.A.; North, H.S.

    1979-01-01

    In March 1976 State of Nevada radiological health officials became aware that radioactive items destined for disposal at a radioactive waste burial facility near Beatty, Nevada had instead been distributed to wide segments of the public domain. Because the facility was jointly licensed by the State of Nevada and the Federal Nuclear Regulatory Commission, both agencies quickly responded. It was learned that over a period of several years a practice existed at the disposal facility of opening containers, removing contents and allowing employees to take items of worth or fancy. Numerous items such as hand tools, electric motors, laboratory instruments, shipping containers, etc., had received widespread and uncontrolled distribution in the town of Beatty as well as lesser distributions to other locations. Because the situation might have had the potential for a significant health and safety impact, a comprehensive recovery operation was conducted. During the course of seven days of intense effort, thirty-five individuals became involved in a comprehensive door-by-door survey and search of the town. Aerial surveys were performed using a helicopter equipped with sensitive radiation detectors, while ground-level scans were conducted using a van containing similar instrumentation. Aerial reconnaissance photographs were taken, a special town meeting was held and numerous persons were interviewed. The recovery effort resulted in the retrieval of an estimated 20 to 25 pickup truck loads of radioactively contaminated equipment as well as several loads of large items returned on a 40-foot flatbed trailer.

  16. Use of media and public-domain Internet sources for detection and assessment of plant health threats.

    Science.gov (United States)

    Thomas, Carla S; Nelson, Noele P; Jahn, Gary C; Niu, Tianchan; Hartley, David M

    2011-09-05

    Event-based biosurveillance is a recognized approach to early warning and situational awareness of emerging health threats. In this study, we build upon previous human and animal health work to develop a new approach to plant pest and pathogen surveillance. We show that monitoring public domain electronic media for indications and warning of epidemics and associated social disruption can provide information about the emergence and progression of plant pest infestation or disease outbreak. The approach is illustrated using a case study, which describes a plant pest and pathogen epidemic in China and Vietnam from February 2006 to December 2007, and the role of ducks in contributing to zoonotic virus spread in birds and humans. This approach could be used as a complementary method to traditional plant pest and pathogen surveillance to aid global and national plant protection officials and political leaders in early detection and timely response to significant biological threats to plant health, economic vitality, and social stability. This study documents the inter-relatedness of health in human, animal, and plant populations and emphasizes the importance of plant health surveillance.

  17. Progress in parallel implementation of the multilevel plane wave time domain algorithm

    KAUST Repository

    Liu, Yang

    2013-07-01

    The computational complexity and memory requirements of classical schemes for evaluating transient electromagnetic fields produced by Ns dipoles active for Nt time steps scale as O(Nt Ns^2) and O(Ns^2), respectively. The multilevel plane wave time domain (PWTD) algorithm [A.A. Ergin et al., Antennas and Propagation Magazine, IEEE, vol. 41, pp. 39-52, 1999], viz. the extension of the frequency domain fast multipole method (FMM) to the time domain, reduces the above costs to O(Nt Ns log^2 Ns) and O(Ns^α), with α = 1.5 for surface current distributions and α = 4/3 for volumetric ones. Its favorable computational and memory costs notwithstanding, serial implementations of the PWTD scheme unfortunately remain somewhat limited in scope and ill-suited to tackle complex real-world scattering problems, and parallel implementations are called for. © 2013 IEEE.

  18. Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    Energy Technology Data Exchange (ETDEWEB)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jackowski, Marcel P., E-mail: mjack@ime.usp.b [University of Sao Paulo (USP), SP (Brazil). Dept. of Computer Science

    2011-07-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This noise increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)
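    A linear frequency-domain filter of the kind described can be sketched with a 2-D FFT: transform the noisy reconstruction, suppress high spatial frequencies, and transform back. The phantom, noise level, and ideal low-pass cutoff below are illustrative assumptions, not the authors' filter design:

    ```python
    import numpy as np

    def lowpass_filter(img, cutoff):
        """Ideal low-pass filter applied in the 2-D spatial-frequency domain."""
        F = np.fft.fftshift(np.fft.fft2(img))
        ny, nx = img.shape
        y, x = np.ogrid[:ny, :nx]
        r = np.hypot(y - ny // 2, x - nx // 2)
        F[r > cutoff] = 0.0               # suppress high-frequency content
        return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64))
    clean[24:40, 24:40] = 1.0             # a simple "hot region" phantom
    noisy = clean + 0.4 * rng.standard_normal(clean.shape)

    filtered = lowpass_filter(noisy, cutoff=10)
    err_noisy = np.mean((noisy - clean) ** 2)
    err_filt = np.mean((filtered - clean) ** 2)
    print(err_filt < err_noisy)  # filtering reduces mean squared error
    ```

    Zeroing coefficients outside the pass band removes most of the broadband noise that E-M iterations amplify, at the cost of some edge blurring.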

  19. Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    International Nuclear Information System (INIS)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio; Jackowski, Marcel P.

    2011-01-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This noise increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)

  20. Domain decomposition method for solving the neutron diffusion equation

    International Nuclear Information System (INIS)

    Coulomb, F.

    1989-03-01

    The aim of this work is to study methods for solving the neutron diffusion equation; we are interested in methods based on a classical finite element discretization and well suited for use on parallel computers. Domain decomposition methods seem to answer this need. This study deals with a decomposition of the domain. A theoretical study is carried out for Lagrange finite elements and some examples are given; in the case of mixed dual finite elements, the study is based on examples [fr

  1. Shape of isolated domains in lithium tantalate single crystals at elevated temperatures

    International Nuclear Information System (INIS)

    Shur, V. Ya.; Akhmatkhanov, A. R.; Baturin, I. S.; Chezganov, D. S.; Lobov, A. I.; Smirnov, M. M.

    2013-01-01

    The shape of isolated domains has been investigated in congruent lithium tantalate (CLT) single crystals at elevated temperatures and analyzed in terms of a kinetic approach. The obtained temperature dependence of the growing domain shape in CLT, including a circular shape at temperatures above 190 °C, has been attributed to the increased relative contribution of isotropic ionic conductivity. The observed nonstop wall motion and independent domain growth after merging in CLT, as opposed to stoichiometric lithium tantalate, have been attributed to a difference in wall orientation. Computer simulation has confirmed the applicability of the kinetic approach to explaining the domain shape.

  2. A role for chromatin topology in imprinted domain regulation.

    Science.gov (United States)

    MacDonald, William A; Sachani, Saqib S; White, Carlee R; Mann, Mellissa R W

    2016-02-01

    Recently, many advancements in genome-wide chromatin topology and nuclear architecture have unveiled the complex and hidden world of the nucleus, where chromatin is organized into discrete neighbourhoods with coordinated gene expression. This includes the active and inactive X chromosomes. Using X chromosome inactivation as a working model, we utilized publicly available datasets together with a literature review to gain insight into topologically associated domains, lamin-associated domains, nucleolar-associating domains, scaffold/matrix attachment regions, and nucleoporin-associated chromatin and their role in regulating monoallelic expression. Furthermore, we comprehensively review for the first time the role of chromatin topology and nuclear architecture in the regulation of genomic imprinting. We propose that chromatin topology and nuclear architecture are important regulatory mechanisms for directing gene expression within imprinted domains. Furthermore, we predict that dynamic changes in chromatin topology and nuclear architecture play roles in tissue-specific imprint domain regulation during early development and differentiation.

  3. SURF: a subroutine code to draw the axonometric projection of a surface generated by a scalar function over a discretized plane domain using finite element computations

    International Nuclear Information System (INIS)

    Giuliani, Giovanni; Giuliani, Silvano.

    1980-01-01

    The FORTRAN IV subroutine SURF has been designed to help visualise the results of Finite Element computations. It draws the axonometric projection of a surface generated in 3-dimensional space by a scalar function over a discretized plane domain. The most important characteristic of the routine is that it removes hidden lines, enabling a clear view of the details of the generated surface.

  4. Domain decomposition methods for the mixed dual formulation of the critical neutron diffusion problem

    International Nuclear Information System (INIS)

    Guerin, P.

    2007-12-01

    The neutronic simulation of a nuclear reactor core is performed using the neutron transport equation, and leads to an eigenvalue problem in the steady-state case. Among the deterministic resolution methods, diffusion approximation is often used. For this problem, the MINOS solver based on a mixed dual finite element method has shown its efficiency. In order to take advantage of parallel computers, and to reduce the computing time and the local memory requirement, we propose in this dissertation two domain decomposition methods for the resolution of the mixed dual form of the eigenvalue neutron diffusion problem. The first approach is a component mode synthesis method on overlapping sub-domains. Several eigenmode solutions of a local problem solved by MINOS on each sub-domain are taken as basis functions for the resolution of the global problem on the whole domain. The second approach is a modified iterative Schwarz algorithm based on non-overlapping domain decomposition with Robin interface conditions. At each iteration, the problem is solved on each sub-domain by MINOS with the interface conditions deduced from the solutions on the adjacent sub-domains at the previous iteration. The iterations allow the simultaneous convergence of the domain decomposition and the eigenvalue problem. We demonstrate the accuracy and the parallel efficiency of these two methods with numerical results for the diffusion model on realistic 2- and 3-dimensional cores. (author)
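    The iterative flavour of such domain decomposition can be seen in the classical alternating Schwarz method for a 1-D Poisson problem. This toy uses overlapping sub-domains with Dirichlet interface data, a simpler cousin of the non-overlapping Robin-condition algorithm of the thesis:

    ```python
    import numpy as np

    def solve_poisson(f, a, b, ua, ub, n):
        """Dirichlet solve of -u'' = f on [a, b] with n interior points."""
        h = (b - a) / (n + 1)
        x = np.linspace(a, b, n + 2)
        A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        rhs = f(x[1:-1])
        rhs[0] += ua / h**2
        rhs[-1] += ub / h**2
        u = np.zeros(n + 2)
        u[0], u[-1] = ua, ub
        u[1:-1] = np.linalg.solve(A, rhs)
        return x, u

    # -u'' = pi^2 sin(pi x) on [0,1], u(0)=u(1)=0, exact solution sin(pi x).
    f = lambda x: np.pi**2 * np.sin(np.pi * x)
    exact = lambda x: np.sin(np.pi * x)

    # Two overlapping sub-domains [0, 0.6] and [0.4, 1].
    g1 = g2 = 0.0  # interface values at x=0.6 and x=0.4
    for it in range(20):
        x1, u1 = solve_poisson(f, 0.0, 0.6, 0.0, g1, 59)
        g2 = np.interp(0.4, x1, u1)  # pass the new trace to the right domain
        x2, u2 = solve_poisson(f, 0.4, 1.0, g2, 0.0, 59)
        g1 = np.interp(0.6, x2, u2)  # and back to the left domain

    print(abs(np.interp(0.5, x1, u1) - exact(0.5)) < 1e-3)
    ```

    Each sweep solves only local problems; the interface values carry the coupling, so the iteration converges to the single-domain discrete solution.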

  5. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  6. Reply to "Domain-growth kinetics of systems with soft walls''

    DEFF Research Database (Denmark)

    Mouritsen, Ole G.; Præstgaard, Eigil

    1988-01-01

    On the basis of computer-simulation results for three different models with soft domain walls it is argued that the zero-temperature domain-growth kinetics falls in a separate universality class characterized by a kinetic growth exponent n≃0.25. However, for finite temperatures there is a distinct...... crossover to Lifshitz-Allen-Cahn kinetics n=0.50, thus suggesting that the soft-wall and hard-wall universality classes become identical at finite temperatures....

  7. Super-Grid Modeling of the Elastic Wave Equation in Semi-Bounded Domains

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, N. Anders; Sjögreen, Björn

    2014-10-01

    We develop a super-grid modeling technique for solving the elastic wave equation in semi-bounded two- and three-dimensional spatial domains. In this method, waves are slowed down and dissipated in sponge layers near the far-field boundaries. Mathematically, this is equivalent to a coordinate mapping that transforms a very large physical domain to a significantly smaller computational domain, where the elastic wave equation is solved numerically on a regular grid. To damp out waves that become poorly resolved because of the coordinate mapping, a high order artificial dissipation operator is added in layers near the boundaries of the computational domain. We prove by energy estimates that the super-grid modeling leads to a stable numerical method with decreasing energy, which is valid for heterogeneous material properties and a free surface boundary condition on one side of the domain. Our spatial discretization is based on a fourth order accurate finite difference method, which satisfies the principle of summation by parts. We show that the discrete energy estimate holds also when a centered finite difference stencil is combined with homogeneous Dirichlet conditions at several ghost points outside of the far-field boundaries. Therefore, the coefficients in the finite difference stencils need only be boundary modified near the free surface. This allows for improved computational efficiency and significant simplifications of the implementation of the proposed method in multi-dimensional domains. Numerical experiments in three space dimensions show that the modeling error from truncating the domain can be made very small by choosing a sufficiently wide super-grid damping layer. The numerical accuracy is first evaluated against analytical solutions of Lamb’s problem, where fourth order accuracy is observed with a sixth order artificial dissipation. We then use successive grid refinements to study the numerical accuracy in the more
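    The sponge-layer idea can be sketched in one dimension: add a damping term -σ(x)·u_t that is zero in the interior and ramps up smoothly in layers near the boundaries. The scheme below is a plain second-order sketch, not the summation-by-parts discretization of the paper; the layer width and damping strength are illustrative:

    ```python
    import numpy as np

    def run_wave(use_sponge, nx=400, nt=1600):
        """u_tt = u_xx - sigma(x) u_t on [0,1] with reflecting ends u=0."""
        h = 1.0 / nx
        dt = 0.5 * h                       # CFL-stable time step
        x = np.linspace(0.0, 1.0, nx + 1)
        sigma = np.zeros_like(x)
        if use_sponge:
            w = 0.15                       # sponge-layer width (assumed)
            sigma = (80.0 * np.clip((x - (1 - w)) / w, 0, 1) ** 2
                     + 80.0 * np.clip((w - x) / w, 0, 1) ** 2)
        u = np.exp(-((x - 0.5) / 0.05) ** 2)   # initial pulse, zero velocity
        up = u.copy()                      # previous time level
        for _ in range(nt):
            lap = np.zeros_like(u)
            lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2
            un = (2 * u - up + dt**2 * lap
                  + 0.5 * dt * sigma * up) / (1 + 0.5 * dt * sigma)
            un[0] = un[-1] = 0.0
            up, u = u, un
        return np.sum(u**2)

    damped = run_wave(True)
    plain = run_wave(False)
    print(damped < 0.01 * plain)  # sponge layers absorb the outgoing waves
    ```

    Without the layers the pulse keeps bouncing between the reflecting ends; with them, the outgoing energy is dissipated, mimicking an unbounded domain.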

  8. Beyond cross-domain learning: Multiple-domain nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Traditional cross-domain learning methods transfer learning from a source domain to a target domain. In this paper, we propose the multiple-domain learning problem for several equally treated domains. The multiple-domain learning problem assumes that samples from different domains have different distributions, but share the same feature and class label spaces. Each domain could be a target domain, while also be a source domain for other domains. A novel multiple-domain representation method is proposed for the multiple-domain learning problem. This method is based on nonnegative matrix factorization (NMF), and tries to learn a basis matrix and coding vectors for samples, so that the domain distribution mismatch among different domains will be reduced under an extended variation of the maximum mean discrepancy (MMD) criterion. The novel algorithm - multiple-domain NMF (MDNMF) - was evaluated on two challenging multiple-domain learning problems - multiple user spam email detection and multiple-domain glioma diagnosis. The effectiveness of the proposed algorithm is experimentally verified. © 2013 Elsevier Ltd. All rights reserved.

  9. Beyond cross-domain learning: Multiple-domain nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-02-01

    Traditional cross-domain learning methods transfer learning from a source domain to a target domain. In this paper, we propose the multiple-domain learning problem for several equally treated domains. The multiple-domain learning problem assumes that samples from different domains have different distributions, but share the same feature and class label spaces. Each domain could be a target domain, while also be a source domain for other domains. A novel multiple-domain representation method is proposed for the multiple-domain learning problem. This method is based on nonnegative matrix factorization (NMF), and tries to learn a basis matrix and coding vectors for samples, so that the domain distribution mismatch among different domains will be reduced under an extended variation of the maximum mean discrepancy (MMD) criterion. The novel algorithm - multiple-domain NMF (MDNMF) - was evaluated on two challenging multiple-domain learning problems - multiple user spam email detection and multiple-domain glioma diagnosis. The effectiveness of the proposed algorithm is experimentally verified. © 2013 Elsevier Ltd. All rights reserved.
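    The NMF backbone of MDNMF can be illustrated with the classical Lee-Seung multiplicative updates for X ≈ WH under the Frobenius objective; the domain-alignment (MMD) term and multi-domain bookkeeping of the paper are omitted in this sketch:

    ```python
    import numpy as np

    def nmf(X, k, n_iter=200, eps=1e-9):
        """Lee-Seung multiplicative updates for min ||X - W H||_F^2."""
        rng = np.random.default_rng(0)
        n, m = X.shape
        W = rng.random((n, k)) + 0.1
        H = rng.random((k, m)) + 0.1
        for _ in range(n_iter):
            H *= (W.T @ X) / (W.T @ W @ H + eps)   # update coding vectors
            W *= (X @ H.T) / (W @ H @ H.T + eps)   # update basis matrix
        return W, H

    # A nonnegative matrix with an exact rank-2 factorisation.
    rng = np.random.default_rng(1)
    X = rng.random((30, 2)) @ rng.random((2, 40))
    W, H = nmf(X, k=2)
    rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
    print(rel_err < 0.1)  # the factorisation reconstructs X closely
    ```

    MDNMF would add a penalty to this objective so that the coding vectors H of samples from different domains have matching distributions.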

  10. Memetic Algorithms, Domain Knowledge, and Financial Investing

    Science.gov (United States)

    Du, Jie

    2012-01-01

    While the question of how to use human knowledge to guide evolutionary search is long-recognized, much remains to be done to answer this question adequately. This dissertation aims to further answer this question by exploring the role of domain knowledge in evolutionary computation as applied to real-world, complex problems, such as financial…

  11. Understanding Situated Social Interactions: A Case Study of Public Places in the City

    DEFF Research Database (Denmark)

    Paay, Jeni; Kjeldskov, Jesper

    2008-01-01

    these and their situated interactions. In response, this paper addresses the challenge of informing design of mobile services for fostering social connections by using the concept of place for studying and understanding peoples’ social activities in a public built environment. We present a case study of social experience...... of a physical place providing an understanding of peoples’ situated social interactions in public places of the city derived through a grounded analysis of small groups of friends socialising out on the town. Informed by this, we describe the design and evaluation of a mobile prototype system facilitating......Ubiquitous and mobile computer technologies are increasingly being appropriated to facilitate people’s social life outside the work domain. Designing such social and collaborative technologies requires an understanding of peoples’ physical and social context, and the interplay between...

  12. Cloud computing and services science

    NARCIS (Netherlands)

    Ivanov, Ivan; van Sinderen, Marten J.; Shishkov, Boris

    2012-01-01

    This book is essentially a collection of the best papers of the International Conference on Cloud Computing and Services Science (CLOSER), which was held in Noordwijkerhout, The Netherlands on May 7–9, 2011. The conference addressed technology trends in the domain of cloud computing in relation to a

  13. Regular periodical public disclosure obligations of public companies

    Directory of Open Access Journals (Sweden)

    Marjanski Vladimir

    2011-01-01

    Full Text Available Public companies, in their capacity as capital market participants, have the obligation to inform the public on their legal and financial status, their general business operations, as well as on the issuance of securities and other financial instruments. Such obligations may be divided into two groups: the first group consists of regular periodical public disclosures, such as the publication of financial reports (annual, semi-annual and quarterly) and the management's reports on the public company's business operations. The second group comprises the obligation of occasional (ad hoc) public disclosure. The thesis analyses the obligation of public companies to inform the public in the course of their regular reporting. The new Capital Market Law, based on two EU Directives (the Transparency Directive and the Directive on Public Disclosure of Inside Information and the Definition of Market Manipulation), regulates such obligations of public companies in substantially more detail than the prior Law on the Market of Securities and Other Financial Instruments (hereinafter: ZTHV). Accordingly, the ZTHV's provisions are compared to the new solutions within the domain of regular periodical disclosure of the Capital Market Law.

  14. Protein domain recurrence and order can enhance prediction of protein functions

    KAUST Repository

    Abdel Messih, Mario A.

    2012-09-07

    Motivation: Burgeoning sequencing technologies have generated massive amounts of genomic and proteomic data. Annotating the functions of proteins identified in this data has become a big and crucial problem. Various computational methods have been developed to infer protein functions based on either the sequences or domains of proteins. The existing methods, however, ignore the recurrence and the order of the protein domains in this function inference. Results: We developed two new methods to infer protein functions based on protein domain recurrence and domain order. Our first method, DRDO, calculates the posterior probability of the Gene Ontology terms based on domain recurrence and domain order information, whereas our second method, DRDO-NB, relies on the naïve Bayes methodology using the same domain architecture information. Our large-scale benchmark comparisons show strong improvements in the accuracy of the protein function inference achieved by our new methods, demonstrating that domain recurrence and order can provide important information for inference of protein functions. © The Author(s) 2012. Published by Oxford University Press.
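    As a rough illustration of the DRDO-NB idea, a naïve Bayes model can score a function label from a protein's domain composition, with domain recurrence entering through multiplicities. The proteins, domains, and labels below are invented for the example and do not reflect the paper's data:

    ```python
    from collections import Counter, defaultdict
    import math

    # Hypothetical training data: (domain architecture, function label).
    train = [
        (["kinase", "SH2"], "signaling"),
        (["kinase", "SH3", "SH2"], "signaling"),
        (["zf-C2H2", "zf-C2H2", "homeobox"], "dna_binding"),
        (["zf-C2H2", "homeobox"], "dna_binding"),
    ]

    label_counts = Counter(lbl for _, lbl in train)
    domain_counts = defaultdict(Counter)
    for doms, lbl in train:
        domain_counts[lbl].update(doms)   # recurrence enters via multiplicity

    vocab = {d for doms, _ in train for d in doms}

    def predict(doms):
        best, best_lp = None, -math.inf
        for lbl, lc in label_counts.items():
            lp = math.log(lc / len(train))
            total = sum(domain_counts[lbl].values())
            for d in doms:   # Laplace-smoothed multinomial likelihood
                lp += math.log((domain_counts[lbl][d] + 1) / (total + len(vocab)))
            if lp > best_lp:
                best, best_lp = lbl, lp
        return best

    print(predict(["SH2", "kinase"]))        # signaling
    print(predict(["homeobox", "zf-C2H2"]))  # dna_binding
    ```

    The paper's DRDO-NB additionally exploits domain order, which a bag-of-domains model like this one ignores.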

  15. A time domain phase-gradient based ISAR autofocus algorithm

    CSIR Research Space (South Africa)

    Nel, W

    2011-10-01

    Full Text Available . Results on simulated and measured data show that the algorithm performs well. Unlike many other ISAR autofocus techniques, the algorithm does not make use of several computationally intensive iterations between the data and image domains as part...

  16. Domain-growth kinetics and aspects of pinning: A Monte Carlo simulation study

    DEFF Research Database (Denmark)

    Castán, T.; Lindgård, Per-Anker

    1991-01-01

    By means of Monte Carlo computer simulations we study the domain-growth kinetics after a quench across a first-order line to very low and moderate temperatures in a multidegenerate system with nonconserved order parameter. The model is a continuous spin model relevant for martensitic transformations...... to cross over from n = 1/4 at T approximately 0 to n = 1/2 with temperature for models with pinnings of types (a) and (b). For topological pinnings at T approximately 0, n is consistent with n = 1/8, a value conceivable for several levels of hierarchically interrelated domain-wall movement. When...
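    A minimal version of such a simulation is a Metropolis quench of a 2-D Ising-like model: starting from a random configuration, domains coarsen and the energy drops. This sketch uses the plain Ising model, not the continuous-spin martensitic model of the study:

    ```python
    import numpy as np

    def metropolis_quench(L=32, sweeps=30, T=0.1, seed=0):
        """Quench a random spin configuration; domains coarsen, energy drops."""
        rng = np.random.default_rng(seed)
        s = rng.choice([-1, 1], size=(L, L))

        def energy(s):
            # Nearest-neighbour bond energy with periodic boundaries.
            return -np.sum(s * np.roll(s, 1, 0) + s * np.roll(s, 1, 1))

        e0 = energy(s)
        for _ in range(sweeps * L * L):
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] *= -1              # Metropolis acceptance rule
        return e0, energy(s)

    e0, e1 = metropolis_quench()
    print(e1 < e0)  # coarsening domains lower the configuration energy
    ```

    Measuring the characteristic domain size L(t) over time in such runs is what yields growth exponents n in L(t) ~ t^n.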

  17. A novel algorithm for fractional resizing of digital image in DCT domain

    Institute of Scientific and Technical Information of China (English)

    Wang Ci; Zhang Wenjun; Zheng Meng

    2005-01-01

    Fractional resizing of digital images is needed in various applications, such as displaying at a resolution matching that of the display device, building an image index for an image database, and changing resolution according to the transmission channel bandwidth. With the wide use of JPEG and MPEG, almost all digital images are stored and transferred in DCT-compressed format. In order to save computation and memory cost, it is desirable to do the resizing directly in the DCT domain. This paper presents a fast and efficient method capable of fractional resizing in the DCT domain. Experimental results confirm that this scheme achieves a significant reduction in computation cost while maintaining comparable quality.
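    The standard trick behind DCT-domain resizing can be sketched directly: take the 2-D DCT of a block, keep (or zero-pad) the low-frequency corner, rescale, and invert. The renormalisation convention below is one common choice and not necessarily the paper's scheme:

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II matrix of size n x n."""
        k = np.arange(n)[:, None]
        i = np.arange(n)[None, :]
        C = np.cos(np.pi * k * (2 * i + 1) / (2 * n)) * np.sqrt(2.0 / n)
        C[0] /= np.sqrt(2.0)
        return C

    def resize_dct(block, m):
        """Resize an n x n block to m x m by truncating/padding DCT coefficients."""
        n = block.shape[0]
        Cn, Cm = dct_matrix(n), dct_matrix(m)
        coeff = Cn @ block @ Cn.T              # forward 2-D DCT
        out = np.zeros((m, m))
        s = min(n, m)
        out[:s, :s] = coeff[:s, :s]            # keep the low-frequency corner
        out *= m / n                           # renormalise amplitude
        return Cm.T @ out @ Cm                 # inverse 2-D DCT

    # A smooth 8x8 block downsampled to 4x4 keeps its low-frequency content.
    x = np.arange(8)
    block = np.outer(np.cos(np.pi * x / 8), np.cos(np.pi * x / 8))
    small = resize_dct(block, 4)
    print(small.shape)  # (4, 4)
    ```

    Because compressed images already store DCT coefficients per block, this avoids a full decode-resize-re-encode round trip.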

  18. Ontology Design for Solving Computationally-Intensive Problems on Heterogeneous Architectures

    Directory of Open Access Journals (Sweden)

    Hossam M. Faheem

    2018-02-01

    Full Text Available Viewing a computationally-intensive problem as a self-contained challenge with its own hardware, software and scheduling strategies is an approach that should be investigated. We might suggest assigning heterogeneous hardware architectures to solve a problem, while parallel computing paradigms may play an important role in writing efficient code to solve the problem; moreover, the scheduling strategies may be examined as a possible solution. Depending on the problem complexity, finding the best possible solution using an integrated infrastructure of hardware, software and scheduling strategy can be a complex job. Developing and using ontologies and reasoning techniques play a significant role in reducing the complexity of identifying the components of such integrated infrastructures. Undertaking reasoning and inferencing regarding the domain concepts can help to find the best possible solution through a combination of hardware, software and scheduling strategies. In this paper, we present an ontology and show how we can use it to solve computationally-intensive problems from various domains. As a potential use for the idea, we present examples from the bioinformatics domain. Validation by using problems from the Elastic Optical Network domain has demonstrated the flexibility of the suggested ontology and its suitability for use with any other computationally-intensive problem domain.

  19. The national public's values and interests related to the Arctic National Wildlife Refuge: A computer content analysis

    Science.gov (United States)

    David N. Bengston; David P. Fan; Roger Kaye

    2010-01-01

    This study examined the national public's values and interests related to the Arctic National Wildlife Refuge. Computer content analysis was used to analyze more than 23,000 media stories about the refuge from 1995 through 2007. Ten main categories of Arctic National Wildlife Refuge values and interests emerged from the analysis, reflecting a diversity of values,...

  20. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process

    Directory of Open Access Journals (Sweden)

    El Sawaf Gamal

    2009-07-01

    Full Text Available Abstract Background The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Results Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by applying the sequence alignment method to the FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with respect to a control HCV sequence from the Core protein, giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows a direct comparison of domains for the presence of common hydrophobicity patterns, on which the physical interaction would be based. RQA greatly strengthened the reliability of the hypothesis by scoring many cross-recurrences between the FP and CT hydrophobicity patterns, largely outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT
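
    The co-evolution test sketched above reduces, at its core, to correlating inter-strain distance matrices computed on the two candidate domains. A minimal version with invented five-residue fragments from four hypothetical strains (a toy Mantel-style calculation, not the paper's alignment pipeline):

```python
def hamming(a, b):
    """Number of mismatched positions between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b))

def upper_distances(seqs):
    """Pairwise Hamming distances (upper triangle, row-major order)."""
    n = len(seqs)
    return [hamming(seqs[i], seqs[j]) for i in range(n) for j in range(i + 1, n)]

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented aligned fragments of two putative domains from four strains:
fp = ["GFLAL", "GFLAL", "GALSL", "GALTL"]
ct = ["WYIKG", "WYIKG", "WFVKG", "WFAKG"]
score = pearson(upper_distances(fp), upper_distances(ct))
```

    A high correlation between the two distance matrices is what the co-evolution paradigm reads as evidence of a physical interaction constraint.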

  1. Ethics, big data and computing in epidemiology and public health.

    Science.gov (United States)

    Salerno, Jennifer; Knoppers, Bartha M; Lee, Lisa M; Hlaing, WayWay M; Goodman, Kenneth W

    2017-05-01

    This article reflects on the activities of the Ethics Committee of the American College of Epidemiology (ACE). Members of the Ethics Committee identified an opportunity to elaborate on knowledge gained since the inception of the original Ethics Guidelines published by the ACE Ethics and Standards of Practice Committee in 2000. The ACE Ethics Committee presented a symposium session at the 2016 Epidemiology Congress of the Americas in Miami on the evolving complexities of ethics and epidemiology as it pertains to "big data." This article presents a summary and further discussion of that symposium session. Three topic areas were presented: the policy implications of big data and computing, the fallacy of "secondary" data sources, and the duty of citizens to contribute to big data. A balanced perspective is needed that provides safeguards for individuals but also furthers research to improve population health. Our in-depth review offers next steps for teaching of ethics and epidemiology, as well as for epidemiological research, public health practice, and health policy. To address contemporary topics in the area of ethics and epidemiology, the Ethics Committee hosted a symposium session on the timely topic of big data. Technological advancements in clinical medicine and genetic epidemiology research coupled with rapid advancements in data networks, storage, and computation at a lower cost are resulting in the growth of huge data repositories. Big data increases concerns about data integrity; informed consent; protection of individual privacy, confidentiality, and harm; data reidentification; and the reporting of faulty inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users are more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles the limitations of Smart Mobile Devices (SMDs), such as limited battery lifetime, limited processing capability, and limited storage capacity, by offloading the execution and workload to richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
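
    A minimal sketch of the kind of decision rule offloading frameworks evaluate, with purely illustrative parameters (a real framework would also model energy, queuing and failure):

```python
def should_offload(cycles, data_bits, f_local, f_cloud, bandwidth, rtt=0.0):
    """Offload when remote execution time plus transfer time beats
    local execution time (all quantities in consistent units)."""
    t_local = cycles / f_local
    t_remote = cycles / f_cloud + data_bits / bandwidth + rtt
    return t_remote < t_local
```

    For example, a 5-gigacycle task with 1 MB of state is worth shipping over a 2 Mbit/s link to a 10x-faster server, but not once the payload grows large relative to the compute saved.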

  3. PREFACE: Domain wall dynamics in nanostructures Domain wall dynamics in nanostructures

    Science.gov (United States)

    Marrows, C. H.; Meier, G.

    2012-01-01

    Domain structures in magnetic materials are ubiquitous and have been studied for decades. The walls that separate them are topological defects in the magnetic order parameter and have a wide variety of complex forms. In general, their investigation is difficult in bulk materials since only the domain structure on the surface of a specimen is visible. Cutting the sample to reveal the interior causes a rearrangement of the domains into a new form. As with many other areas of magnetism, the study of domain wall physics has been revitalised by the advent of nanotechnology. The ability to fabricate nanoscale structures has permitted the formation of simplified and controlled domain patterns; the development of advanced microscopy methods has permitted them to be imaged and then modelled; subjecting them to ultrashort field and current pulses has permitted their dynamics to be explored. The latest results from all of these advances are described in this special issue. Not only has this led to results of great scientific beauty, but also to concepts of great applicability to future information technologies. In this issue the reader will find the latest results for these domain wall dynamics and the high-speed processes of topological structures such as domain walls and magnetic vortices. These dynamics can be driven by the application of magnetic fields, or by flowing currents through spintronic devices using the novel physics of spin-transfer torque. This complexity has been studied using a wide variety of experimental techniques at the edge of the spatial and temporal resolution currently available, and can be described using sophisticated analytical theory and computational modelling. As a result, the dynamics can be engineered to give rise to finely controlled memory and logic devices with new functionality. Moreover, the field is moving to study not only the conventional transition metal ferromagnets, but also complex heterostructures, novel magnets and even other

  4. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    background on China’s scientific research and on their computer science before 1978. A useful companion to the directory is another publication of the... bimonthly publication in Portuguese; occasional translation of foreign articles into Portuguese. Data News: a bimonthly industry newsletter. Sistemas: ...computer-related topics; Spanish. Delta: publication of local users group; Spanish. Sistemas: publication of System Engineers of Colombia; Spanish. CUBA

  5. Health domains for sale: the need for global health Internet governance.

    Science.gov (United States)

    Mackey, Tim Ken; Liang, Bryan A; Kohler, Jillian C; Attaran, Amir

    2014-03-05

    A debate on Internet governance for health, or "eHealth governance", is emerging with the impending award of a new dot-health (.health) generic top-level domain name (gTLD) along with a host of other health-related domains. This development is critical as it will shape the future of the health Internet, allowing largely unrestricted use of .health second-level domain names by future registrants, raising concerns about the potential for privacy, use and marketing of health-related information, credibility of online health content, and potential for Internet fraud and abuse. Yet, prospective .health gTLD applicants do not provide adequate safeguards for use of .health or related domains and have few or no ties to the global health community. If approved, one of these for-profit corporate applicants would effectively control the future of the .health address on the Internet with arguably no active oversight from important international public health stakeholders. This would represent a lost opportunity for the public health, medical, and broader health community in establishing a trusted, transparent and reliable source for health on the Internet. Countries, medical associations, civil society, and consumer advocates have objected to these applications on grounds that they do not meet the public interest. We argue that there is an immediate need for action to postpone awarding of the .health gTLD and other health-related gTLDs to address these concerns and ensure the appropriate development of sound eHealth governance rules, principles, and use. This would support the crucial need of ensuring access to quality and evidence-based sources of health information online, as well as establishing a safe and reliable space on the Internet for health. We believe, if properly governed, .health and other domains could represent such a promise in the future.

  6. A domain sequence approach to pangenomics: applications to Escherichia coli [v2; ref status: indexed, http://f1000r.es/ul

    Directory of Open Access Journals (Sweden)

    Lars-Gustav Snipen

    2013-05-01

    Full Text Available The study of microbial pangenomes relies on the computation of gene families, i.e. the clustering of coding sequences into groups of essentially similar genes. There is no standard approach to obtain such gene families. Ideally, the gene family computations should be robust against errors in the annotation of genes in various genomes. In an attempt to achieve this robustness, we propose to cluster sequences by their domain sequence, i.e. the ordered sequence of domains in their protein sequence. In a study of 347 genomes from Escherichia coli we find on average around 4500 proteins having hits in Pfam-A in every genome, clustering into around 2500 distinct domain sequence families in each genome. Across all genomes we find a total of 5724 such families. A binomial mixture model approach indicates this is around 95% of all domain sequences we would expect to see in E. coli in the future. A Heaps law analysis indicates the population of domain sequences is larger, but this analysis is also very sensitive to small changes in the computation procedure. The resolution between strains is good despite the coarse grouping obtained by domain sequence families. Clustering sequences by their ordered domain content gives us domain sequence families, which are robust to errors in the gene prediction step. The computational load of the procedure scales linearly with the number of genomes, which is needed for the future explosion in the number of re-sequenced strains. The use of domain sequence families for a functional classification of strains clearly has potential to be explored.
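
    The clustering step described here is deliberately simple: once each protein is annotated with an ordered list of domains, a family is just the set of proteins sharing that exact domain sequence. A sketch with hypothetical Pfam-style annotations (the accession numbers below are purely illustrative):

```python
from collections import defaultdict

def domain_sequence_families(proteins):
    """Group proteins by their ordered domain content.

    `proteins` maps a protein id to the ordered tuple of domain ids
    (e.g. Pfam accessions) found along its sequence."""
    families = defaultdict(list)
    for pid, domains in proteins.items():
        families[tuple(domains)].append(pid)
    return dict(families)

# Hypothetical annotations for four proteins in two genomes:
proteins = {
    "g1_p1": ("PF00005", "PF00664"),
    "g1_p2": ("PF00072",),
    "g2_p1": ("PF00005", "PF00664"),   # same architecture as g1_p1
    "g2_p2": ("PF00072",),
}
fams = domain_sequence_families(proteins)
```

    Because the key is the domain architecture rather than the raw coding sequence, a mispredicted start codon or frameshift that leaves the domain hits intact does not split a family.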

  7. Domain Specific Language Support for Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    A multi-institutional project known as D-TEC (short for “Domain-specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keep exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions in the multi-institutional project. D-TEC research and development work at Rice University focused on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.
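
    As a toy illustration of the DSL idea — a high-level domain description separated from the generated implementation — consider a one-line stencil "language" whose compiler emits a plain Python function. A real engine such as Rosebud would emit tuned native code; this sketch only shows the separation of concerns:

```python
def compile_stencil(coeffs):
    """'Compile' a 1-D stencil, written as {offset: coefficient}, into an
    executable function; the coefficient mapping is the DSL program."""
    items = sorted(coeffs.items())
    lo = -min(o for o, _ in items)   # left halo width
    hi = max(o for o, _ in items)    # right halo width

    def apply(u):
        # Apply the stencil at every interior point of u.
        return [sum(c * u[i + o] for o, c in items)
                for i in range(lo, len(u) - hi)]
    return apply

# The discrete Laplacian as a three-point stencil:
laplacian = compile_stencil({-1: 1.0, 0: -2.0, 1: 1.0})
```

    The programmer states *what* the operator is; the compiler alone decides *how* it runs, which is exactly the burden shift the abstract describes.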

  8. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturbing permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been performed. The Hessian matrix summarizes the sensitivity of the objective function to model parameters at a given step of the history matching. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PCA is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid-block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching). The delineation of domains requires calculation of the Hessian, which can be computationally costly and also restricts the current approach to
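
    The PCA step amounts to an eigenanalysis of the symmetric Hessian; the leading eigenvectors flag the most sensitive, least correlated parameter combinations. A minimal power-iteration sketch on an invented 3x3 "Hessian" with two weakly coupled sensitivity blocks:

```python
def dominant_eigenvector(H, iters=200):
    """Power iteration for the leading eigenpair of a symmetric matrix,
    standing in for the PCA of the history-matching Hessian."""
    n = len(H)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(H[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Rayleigh quotient gives the corresponding eigenvalue.
    lam = sum(v[i] * sum(H[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

# Toy 'Hessian': parameters 0 and 1 interact, parameter 2 is decoupled.
H = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 0.0],
     [0.0, 0.0, 1.0]]
lam, v = dominant_eigenvector(H)
```

    The near-zero third component of the leading eigenvector is what would assign parameter 2 to a separate reservoir domain in the scheme described above.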

  9. Evaluation of need for ontologies to manage domain content for the Reportable Conditions Knowledge Management System.

    Science.gov (United States)

    Eilbeck, Karen L; Lipstein, Julie; McGarvey, Sunanda; Staes, Catherine J

    2014-01-01

    The Reportable Condition Knowledge Management System (RCKMS) is envisioned to be a single, comprehensive, authoritative, real-time portal to author, view and access computable information about reportable conditions. The system is designed for use by hospitals, laboratories, health information exchanges, and providers to meet public health reporting requirements. The RCKMS Knowledge Representation Workgroup was tasked to explore the need for ontologies to support RCKMS functionality. The workgroup reviewed relevant projects and defined criteria to evaluate candidate knowledge domain areas for ontology development. The use of ontologies is justified for this project to unify the semantics used to describe similar reportable events and concepts between different jurisdictions and over time, to aid data integration, and to manage large, unwieldy datasets that evolve, and are sometimes externally managed.

  10. Brazilian adults' sedentary behaviors by life domain: population-based study.

    Science.gov (United States)

    Mielke, Grégore I; da Silva, Inácio C M; Owen, Neville; Hallal, Pedro C

    2014-01-01

    There is rapidly-emerging evidence on the harmful health effects of sedentary behaviors. The aim of this paper was to quantify time in sedentary behaviors and document socio-demographic variations in different life domains among adults. A population-based survey was carried out in 2012 through face-to-face interviews with Brazilian adults aged 20+ years (N = 2,927). Information about time spent sedentary in a typical weekday was collected for five different domains (workplace, commuting, school/university, watching TV, and computer use at home). Descriptive and bivariate analyses examined variations in overall and domain-specific sedentary time by gender, age, educational attainment and socioeconomic position. On average, participants reported spending 5.8 (SD 4.5) hours per day sitting. The median value was 4.5 (interquartile range: 2.5-8) hours. Men, younger adults, those with higher schooling and from the wealthiest socioeconomic groups had higher overall sedentary scores. TV time was higher in women, older adults and among those with low schooling and socioeconomic position. Sedentary time in transport was higher in men, younger adults, and participants with high schooling and high socioeconomic position. Computer use at home was more frequent among young adults and those from high socioeconomic groups. Sitting at work was higher in those with higher schooling and from the wealthiest socioeconomic groups. Sedentary behavior at school was related inversely to age and directly to schooling. Patterns of sedentary behavior are different by life domains. Initiatives to reduce prolonged sitting among Brazilian adults will be required on multiple levels for different life domains.

  11. Brazilian adults' sedentary behaviors by life domain: population-based study.

    Directory of Open Access Journals (Sweden)

    Grégore I Mielke

    Full Text Available There is rapidly-emerging evidence on the harmful health effects of sedentary behaviors. The aim of this paper was to quantify time in sedentary behaviors and document socio-demographic variations in different life domains among adults. A population-based survey was carried out in 2012 through face-to-face interviews with Brazilian adults aged 20+ years (N = 2,927). Information about time spent sedentary in a typical weekday was collected for five different domains (workplace, commuting, school/university, watching TV, and computer use at home). Descriptive and bivariate analyses examined variations in overall and domain-specific sedentary time by gender, age, educational attainment and socioeconomic position. On average, participants reported spending 5.8 (SD 4.5) hours per day sitting. The median value was 4.5 (interquartile range: 2.5-8) hours. Men, younger adults, those with higher schooling and from the wealthiest socioeconomic groups had higher overall sedentary scores. TV time was higher in women, older adults and among those with low schooling and socioeconomic position. Sedentary time in transport was higher in men, younger adults, and participants with high schooling and high socioeconomic position. Computer use at home was more frequent among young adults and those from high socioeconomic groups. Sitting at work was higher in those with higher schooling and from the wealthiest socioeconomic groups. Sedentary behavior at school was related inversely to age and directly to schooling. Patterns of sedentary behavior are different by life domains. Initiatives to reduce prolonged sitting among Brazilian adults will be required on multiple levels for different life domains.

  12. Domains and domain loss

    DEFF Research Database (Denmark)

    Haberland, Hartmut

    2005-01-01

    politicians and in the media, especially in the discussion whether some languages undergo ‘domain loss’ vis-à-vis powerful international languages like English. An objection that has been raised here is that domains, as originally conceived, are parameters of language choice and not properties of languages...

  13. Improving the performance of DomainDiscovery of protein domain boundary assignment using inter-domain linker index

    Directory of Open Access Journals (Sweden)

    Zomaya Albert Y

    2006-12-01

    Full Text Available Abstract Background Knowledge of protein domain boundaries is critical for the characterisation and understanding of protein function. The ability to identify domains without knowledge of the structure – by using sequence information only – is an essential step in many types of protein analyses. In this present study, we demonstrate that the performance of DomainDiscovery is improved significantly by including the inter-domain linker index value for domain identification from sequence-based information. Improved DomainDiscovery uses a Support Vector Machine (SVM) approach and a unique training dataset built on the principle of consensus among experts in defining domains in protein structure. The SVM was trained using a PSSM (Position Specific Scoring Matrix), secondary structure, solvent accessibility information and the inter-domain linker index to detect possible domain boundaries for a target sequence. Results Improved DomainDiscovery is compared with other methods by benchmarking against a structurally non-redundant dataset and also CASP5 targets. Improved DomainDiscovery achieves 70% accuracy for domain boundary identification in multi-domain proteins. Conclusion Improved DomainDiscovery compares favourably to the performance of other methods and excels in the identification of domain boundaries for multi-domain proteins as a result of introducing a support vector machine with the benchmark_2 dataset.
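
    The "inter-domain linker index" feature can be pictured as a per-residue linker propensity smoothed along the sequence, with boundary candidates appearing as peaks. A sketch with invented propensities — the published index values differ, and the real method feeds this signal into an SVM alongside PSSM and structural features:

```python
def boundary_scores(seq, linker_index, window=7):
    """Smooth a per-residue linker propensity with a sliding window;
    peaks suggest inter-domain boundary candidates."""
    half = window // 2
    vals = [linker_index.get(aa, 0.0) for aa in seq]
    scores = []
    for i in range(len(seq)):
        lo, hi = max(0, i - half), min(len(seq), i + half + 1)
        scores.append(sum(vals[lo:hi]) / (hi - lo))
    return scores

# Illustrative (not published) propensities: P/G/S favoured in linkers.
toy_index = {"P": 1.0, "G": 0.8, "S": 0.6, "A": 0.1, "L": 0.0, "V": 0.0}
scores = boundary_scores("LLVALLAPGSPGALLVLL", toy_index)
peak = max(range(len(scores)), key=scores.__getitem__)
```

    Here the Pro/Gly/Ser-rich stretch in the middle of the toy sequence scores highest, mimicking a linker between two hydrophobic domains.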

  14. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray

  15. A multi-domain Chebyshev collocation method for predicting ultrasonic field parameters in complex material geometries

    DEFF Research Database (Denmark)

    Nielsen, S.A.; Hesthaven, J.S.

    2002-01-01

    elastodynamic formulation, giving a direct solution of the time-domain elastodynamic equations. A typical calculation is performed by decomposing the global computational domain into a number of subdomains. Every subdomain is then mapped on a unit square using transfinite blending functions and spatial...
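
    Within each subdomain the workhorse is the Chebyshev differentiation matrix on Gauss–Lobatto points; the multi-domain method applies it per subdomain and patches values at the interfaces. The classical single-domain construction, following the standard spectral-methods recipe (with the negative-sum trick for the diagonal):

```python
from math import cos, pi

def cheb(n):
    """Chebyshev differentiation matrix on the Gauss-Lobatto points
    x_j = cos(j*pi/n); exact for polynomials of degree <= n."""
    x = [cos(pi * j / n) for j in range(n + 1)]
    c = [2.0 if j in (0, n) else 1.0 for j in range(n + 1)]
    D = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(n + 1):
            if i != j:
                D[i][j] = (c[i] / c[j]) * (-1) ** (i + j) / (x[i] - x[j])
    for i in range(n + 1):
        # Negative-sum trick: rows of D annihilate constants.
        D[i][i] = -sum(D[i][j] for j in range(n + 1) if j != i)
    return D, x

D, x = cheb(8)
# Differentiate f(x) = x^2 exactly (a polynomial of degree <= n):
df = [sum(D[i][j] * x[j] ** 2 for j in range(9)) for i in range(9)]
```

    In the multi-domain setting, each mapped subdomain carries its own such matrix and the elastodynamic fluxes are exchanged at the subdomain interfaces.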

  16. Columbia Public Health Core Curriculum: Short-Term Impact.

    Science.gov (United States)

    Begg, Melissa D; Fried, Linda P; Glover, Jim W; Delva, Marlyn; Wiggin, Maggie; Hooper, Leah; Saxena, Roheeni; de Pinho, Helen; Slomin, Emily; Walker, Julia R; Galea, Sandro

    2015-12-01

    We evaluated a transformed core curriculum for the Columbia University, Mailman School of Public Health (New York, New York) master of public health (MPH) degree. The curriculum, launched in 2012, aims to teach public health as it is practiced: in interdisciplinary teams, drawing on expertise from multiple domains to address complex health challenges. We collected evaluation data starting when the first class of students entered the program and ending with their graduation in May 2014. Students reported being very satisfied with and challenged by the rigorous curriculum and felt prepared to integrate concepts across varied domains and disciplines to solve public health problems. This novel interdisciplinary program could serve as a prototype for other schools that wish to reinvigorate MPH training.

  17. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  18. Gapped fermionic spectrum from a domain wall in seven dimension

    Science.gov (United States)

    Mukhopadhyay, Subir; Rai, Nishal

    2018-05-01

    We obtain a domain wall solution in maximally gauged seven-dimensional supergravity, which interpolates between two AdS spaces and spontaneously breaks a U(1) symmetry. We analyse the frequency dependence of the conductivity and find power-law behaviour at low frequency. We consider certain fermions of supergravity in the background of this domain wall and compute the holographic spectral function of the operators in the dual six-dimensional theory. We find that fermionic operators involving bosons with non-zero expectation value lead to a gapped spectrum.

  19. Finite difference time domain modeling of spiral antennas

    Science.gov (United States)

    Penney, Christopher W.; Beggs, John H.; Luebbers, Raymond J.

    1992-01-01

    The objectives outlined in the original proposal for this project were to create a well-documented computer analysis model based on the finite-difference, time-domain (FDTD) method that would be capable of computing antenna impedance, far-zone radiation patterns, and radar cross-section (RCS). The ability to model a variety of penetrable materials in addition to conductors is also desired. The spiral antennas under study by this project meet these requirements since they are constructed of slots cut into conducting surfaces which are backed by dielectric materials.
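
    DSI3D-scale solvers are far beyond a snippet, but the time-stepping structure of a finite-difference time-domain code is visible even in one dimension. A free-space 1-D Yee update in normalised units at the "magic" Courant number of 1, with an assumed soft Gaussian source (no absorbing boundaries, so the grid edges reflect):

```python
from math import exp

def fdtd_1d(nx=200, steps=150, src=20):
    """Bare-bones 1-D Yee scheme in free space (normalised units):
    a Gaussian pulse launched at `src` propagates one cell per step."""
    ez = [0.0] * nx
    hy = [0.0] * nx
    for t in range(steps):
        for k in range(1, nx):
            ez[k] += hy[k - 1] - hy[k]               # update E from curl H
        ez[src] += exp(-((t - 30) ** 2) / 100.0)     # soft Gaussian source
        for k in range(nx - 1):
            hy[k] += ez[k] - ez[k + 1]               # update H from curl E
    return ez
```

    The production codes discussed in this record add the third dimension, material models, radiation boundary conditions and the far-field transforms on top of exactly this leapfrog core.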

  20. RAPPORT: running scientific high-performance computing applications on the cloud.

    Science.gov (United States)

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  1. 2017 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2017-10-01

    2017 Emerging Technology Domains Risk Survey. Daniel Klinedinst, Joel Land, Kyle O’Meara. October 2017. Technical Report CMU/SEI-2017-TR-008. Distribution Statement A: Approved for Public Release; Distribution is Unlimited. List of Tables: Table 1: New and Emerging Technologies; Table 2: Security Impact of New and Emerging Technologies; Table 3: Severity Classifications and Impact Scores.

  2. 2016 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2016-04-05

    Excerpts from the report: "... measures upon which the CERT/CC based its recommendations and how each domain was triaged for importance. 6. Exploitation Examples details concepts or ..." Distribution Statement A: Approved for Public Release; Distribution is Unlimited. "... only a few vehicles had access to a cellular Internet connection, and only at 3G speeds. Some vehicles already have LTE connections, and many ..."

  3. Mechanical and assembly units of viral capsids identified via quasi-rigid domain decomposition.

    Directory of Open Access Journals (Sweden)

    Guido Polles

    Full Text Available Key steps in a viral life-cycle, such as self-assembly of a protective protein container or in some cases also subsequent maturation events, are governed by the interplay of physico-chemical mechanisms involving various spatial and temporal scales. These salient aspects of a viral life cycle are hence well described and rationalised from a mesoscopic perspective. Accordingly, various experimental and computational efforts have been directed towards identifying the fundamental building blocks that are instrumental for the mechanical response, or constitute the assembly units, of a few specific viral shells. Motivated by these earlier studies we introduce and apply a general and efficient computational scheme for identifying the stable domains of a given viral capsid. The method is based on elastic network models and quasi-rigid domain decomposition. It is first applied to a heterogeneous set of well-characterized viruses (CCMV, MS2, STNV, STMV for which the known mechanical or assembly domains are correctly identified. The validated method is next applied to other viral particles such as L-A, Pariacoto and polyoma viruses, whose fundamental functional domains are still unknown or debated and for which we formulate verifiable predictions. The numerical code implementing the domain decomposition strategy is made freely available.

  4. Time-domain numerical computations of electromagnetic fields in cylindrical co-ordinates using the transmission line matrix: evaluation of radiation losses from a charge bunch passing through a pill-box resonator

    International Nuclear Information System (INIS)

    Sarma, J.; Robson, P.N.

    1979-01-01

    The two-dimensional transmission line matrix (TLM) numerical method has been adapted to compute electromagnetic field distributions in cylindrical co-ordinates and is applied to evaluate the radiation loss from a charge bunch passing through a 'pill-box' resonator. The computer program calculates not only the total energy loss to the resonator but also the component of it that exists in the TM010 mode. The numerically computed results agree very well with analytically derived values found in the literature, which establishes the degree of accuracy obtained with the TLM method. The particular features of computational simplicity, numerical stability and the inherently time-domain solutions produced by the TLM method are cited as additional attractive reasons for using this numerical procedure for such problems. (Auth.)
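For readers unfamiliar with the TLM procedure, the scatter-connect cycle can be sketched on a two-dimensional Cartesian shunt-node mesh (our own illustration; the paper's code works in cylindrical co-ordinates and computes resonator energy loss, which is not reproduced here):

```python
import numpy as np

# 2D shunt-node TLM: four incident port voltages per node
# (0: west, 1: east, 2: south, 3: north).
nx, ny, nt = 60, 60, 40
V = np.zeros((4, nx, ny))
V[:, 30, 30] = 1.0  # impulse excitation at the centre node

for _ in range(nt):
    # Scatter: reflected pulse on port k is half the port sum minus V_k
    # (the standard shunt-node scattering matrix, which is orthogonal).
    Vr = 0.5 * V.sum(axis=0)[None, :, :] - V
    # Connect: pulses leaving a node become incident on its neighbours;
    # the mesh edges are short-circuit walls (reflection coefficient -1).
    Vn = np.empty_like(Vr)
    Vn[0, 1:, :] = Vr[1, :-1, :]   # east-going pulse -> west port of neighbour
    Vn[1, :-1, :] = Vr[0, 1:, :]   # west-going pulse -> east port of neighbour
    Vn[2, :, 1:] = Vr[3, :, :-1]   # north-going pulse -> south port
    Vn[3, :, :-1] = Vr[2, :, 1:]   # south-going pulse -> north port
    Vn[0, 0, :] = -Vr[0, 0, :]
    Vn[1, -1, :] = -Vr[1, -1, :]
    Vn[2, :, 0] = -Vr[2, :, 0]
    Vn[3, :, -1] = -Vr[3, :, -1]
    V = Vn

vz = 0.5 * V.sum(axis=0)  # node (field) voltage
```

Because the scattering matrix is orthogonal and the connection step only permutes pulses (with unit-magnitude boundary reflections), the total pulse energy on the mesh is conserved exactly, which is one source of the method's noted numerical stability.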

  5. Using Sentence-Level Classifiers for Cross-Domain Sentiment Analysis

    Science.gov (United States)

    2014-09-01

    National Defence, 2014. © Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2014. DRDC-RDDC. Excerpt from references: "... domain sentiment classification via spectral feature alignment. In Proceedings of the 19th International Conference on World Wide Web, WWW '10 ..." Author: Dennis, S. Date of publication: September 2014.

  6. Perfectly matched layer method in the finite-difference time-domain and frequency-domain calculations

    DEFF Research Database (Denmark)

    Shyroki, Dzmitry; Lavrinenko, Andrei

    2007-01-01

    A complex-coordinate method known under the guise of the perfectly matched layer (PML) method for treating unbounded domains in computational electrodynamics is related to similar techniques in fluid dynamics and classical quantum theory. It may also find use in electronic-structure finite-difference simulations. Straightforward transfer of the PML formulation to other fields does not seem feasible, however, since it is a unique feature of electrodynamics - the natural invariance - that allows the analytic trick of complex coordinate scaling to be represented as a pure modification of local material parameters...

  7. National cyber defense high performance computing and analysis : concepts, planning and roadmap.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

    There is a national cyber dilemma that threatens the very fabric of government, commercial and private use operations worldwide. Much is written about 'what' the problem is, and though the basis for this paper is an assessment of the problem space, we target the 'how' solution space of the wide-area national information infrastructure through the advancement of science, technology, evaluation and analysis with actionable results intended to produce a more secure national information infrastructure and a comprehensive national cyber defense capability. This cybersecurity High Performance Computing (HPC) analysis concepts, planning and roadmap activity was conducted as an assessment of cybersecurity analysis as a fertile area of research and investment for high value cybersecurity wide-area solutions. This report and a related SAND2010-4765 Assessment of Current Cybersecurity Practices in the Public Domain: Cyber Indications and Warnings Domain report are intended to provoke discussion throughout a broad audience about developing a cohesive HPC centric solution to wide-area cybersecurity problems.

  8. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  9. Reclaiming public space: designing for public interaction with private devices

    DEFF Research Database (Denmark)

    Eriksson, Eva; Hansen, Thomas Riisgaard; Lykke-Olesen, Andreas

    2007-01-01

    Public spaces are changing from being ungoverned places for interaction to more formalized, controlled, less interactive, designed places aimed at fulfilling a purpose. Simultaneously, new personal mobile technology aims at providing private individual spaces in the public domain. In this paper we explore the implications of interacting in public space and how technology can be rethought to not only act as personal devices, but be the tool to reclaim the right and possibility to interact in public spaces. We introduce information exchange, social support and regulation as three central aspects for reclaiming public space. The PhotoSwapper application is presented as an example of a system designed to integrate pervasive technology in a public setting. The system is strongly inspired by the activities at a traditional market place. Based on the design of the application we discuss four...

  10. A pseudospectral collocation time-domain method for diffractive optics

    DEFF Research Database (Denmark)

    Dinesen, P.G.; Hesthaven, J.S.; Lynov, Jens-Peter

    2000-01-01

    We present a pseudospectral method for the analysis of diffractive optical elements. The method computes a direct time-domain solution of Maxwell's equations and is applied to solving wave propagation in 2D diffractive optical elements. (C) 2000 IMACS. Published by Elsevier Science B.V. All rights reserved.

  11. Frequency-domain imaging algorithm for ultrasonic testing by application of matrix phased arrays

    Directory of Open Access Journals (Sweden)

    Dolmatov Dmitry

    2017-01-01

    Full Text Available Constantly increasing demand for high-performance materials and systems in the aerospace industry requires advanced methods of nondestructive testing. One of the most promising methods is ultrasonic imaging using matrix phased arrays. This technique makes it possible to create three-dimensional ultrasonic images with high lateral resolution. Further progress in matrix phased array ultrasonic testing depends on the development of fast imaging algorithms. In this article an imaging algorithm based on frequency-domain calculations is proposed. This approach is computationally efficient in comparison with time-domain algorithms. Performance of the proposed algorithm was tested via computer simulations for a planar specimen with flat-bottom holes.

  12. Computer vision and machine learning for archaeology

    NARCIS (Netherlands)

    van der Maaten, L.J.P.; Boon, P.; Lange, G.; Paijmans, J.J.; Postma, E.

    2006-01-01

    Until now, computer vision and machine learning techniques barely contributed to the archaeological domain. The use of these techniques can support archaeologists in their assessment and classification of archaeological finds. The paper illustrates the use of computer vision techniques for

  13. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  14. Models for randomly distributed nanoscopic domains on spherical vesicles

    Science.gov (United States)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
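The Monte Carlo treatment mentioned for more than three domains can be sketched as follows (a toy version with our own parameter choices, not the paper's code): circular caps of fixed angular radius are placed sequentially on a unit sphere with hard-core exclusion, and the pair angles between domain centres are collected for pair-correlation analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n, rng):
    """Uniform random points on the unit sphere via normalized Gaussians."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def place_domains(n_domains, half_angle, rng, max_tries=10000):
    """Sequentially place domain centres so caps of angular radius
    `half_angle` never overlap (centre separation >= 2 * half_angle)."""
    centres = []
    for _ in range(max_tries):
        c = random_unit_vectors(1, rng)[0]
        if all(np.arccos(np.clip(c @ p, -1.0, 1.0)) >= 2 * half_angle
               for p in centres):
            centres.append(c)
            if len(centres) == n_domains:
                break
    return np.array(centres)

# Five monodisperse domains of 10-degree angular radius (assumed values).
centres = place_domains(5, np.radians(10), rng)
# Pairwise angular separations between domain centres.
dots = np.clip(centres @ centres.T, -1.0, 1.0)
angles = np.arccos(dots[np.triu_indices(len(centres), k=1)])
```

Histogramming `angles` over many independent configurations approximates the interdomain pair distribution that the analytical models above predict.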

  15. Depiction of global trends in publications on mobile health

    Directory of Open Access Journals (Sweden)

    Shahla Foozonkhah

    2017-07-01

    Full Text Available Background: A variety of mobile health initiatives at different levels have been undertaken across many countries. Trends in these initiatives are reflected in the research published in the m-health domain. Aim: This paper aims to depict global trends in published works on the m-health topic. Materials and Methods: The Web of Science database was used to identify all relevant papers published on the mobile health domain worldwide. The search covered documents published from January 1898 to December 2014. The search criteria were "mHealth" or "Mobile health" or "m health" or "m_health" or "m-health" in topics. Results: Findings revealed an increasing trend in citations and publications on m-health research since 2012. English was the most predominant language of publication. The US had the highest number of publications with 649 papers; however, the Netherlands ranked first when publication counts were normalized by country population. "Studies in Health Technology and Informatics" was the source title with the highest number of publications on mobile health topics. Conclusion: The trend observed in this study indicates that growth in the mobile health domain is continuing. This may imply that a new model of health-care delivery is emerging. Further research is needed to specify directions of mobile health research. It is necessary to identify and prioritize the research gaps in this domain.

  16. Computational Acoustics: Computational PDEs, Pseudodifferential Equations, Path Integrals, and All That Jazz

    Science.gov (United States)

    Fishman, Louis

    2000-11-01

    The role of mathematical modeling in the physical sciences will be briefly addressed. Examples will focus on computational acoustics, with applications to underwater sound propagation, electromagnetic modeling, optics, and seismic inversion. Direct and inverse wave propagation problems in both the time and frequency domains will be considered. Focusing on fixed-frequency (elliptic) wave propagation problems, the usual, two-way, partial differential equation formulation will be exactly reformulated, in a well-posed manner, as a one-way (marching) problem. This is advantageous for both direct and inverse considerations, as well as stochastic modeling problems. The reformulation will require the introduction of pseudodifferential operators and their accompanying phase space analysis (calculus), in addition to path integral representations for the fundamental solutions and their subsequent computational algorithms. Unlike the more traditional, purely numerical applications of, for example, finite-difference and finite-element methods, this approach, in effect, writes the exact, or, more generally, the asymptotically correct, answer as a functional integral and, subsequently, computes it directly. The overall computational philosophy is to combine analysis, asymptotics, and numerical methods to attack complicated, real-world problems. Exact and asymptotic analysis will stress the complementary nature of the direct and inverse formulations, as well as indicating the explicit structural connections between the time- and frequency-domain solutions.

  17. Radiative transport-based frequency-domain fluorescence tomography

    International Nuclear Information System (INIS)

    Joshi, Amit; Rasmussen, John C; Sevick-Muraca, Eva M; Wareing, Todd A; McGhee, John

    2008-01-01

    We report the development of radiative transport model-based fluorescence optical tomography from frequency-domain boundary measurements. The coupled radiative transport model describing NIR fluorescence propagation in tissue is solved by novel software based on the established Attila(TM) particle transport simulation platform. The proposed scheme enables the prediction of fluorescence measurements with non-contact sources and detectors at minimal computational cost. An adjoint transport solution-based fluorescence tomography algorithm is implemented on dual grids to efficiently assemble the measurement sensitivity Jacobian matrix. Finally, we demonstrate fluorescence tomography on a realistic computational mouse model to locate nM to μM fluorophore concentration distributions in simulated mouse organs.

  18. MULTILOOP PI CONTROLLER FOR ACHIEVING SIMULTANEOUS TIME AND FREQUENCY DOMAIN SPECIFICATIONS

    Directory of Open Access Journals (Sweden)

    M. SENTHILKUMAR

    2015-08-01

    Full Text Available Most controllers in control systems are designed to satisfy either time-domain or frequency-domain specifications. This work presents the computation of a multiloop PI controller that achieves time- and frequency-domain specifications simultaneously. The desired time- and frequency-domain measures are specified at the outset of the design. To obtain the desired values of the performance measures, a graphical relationship between the PI controller parameters and the performance criteria is given. Thus, by using the graphical method, a set of PI controller parameters meeting the desired performance measures is obtained in an effective and simple way. The coupled tank is a classic benchmark in control engineering for multivariable processes. The proposed control strategy has been implemented on the same coupled-tank process and validated through simulation studies.
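As a sketch of the setting, a multiloop PI simulation on a linearized two-tank model looks like the following (the plant coefficients and PI gains here are our own illustrative values, not the paper's graphically tuned parameters):

```python
# Multiloop PI control of a linearized coupled-tank model (illustrative).
# Each loop pairs one inflow with one level: u1 -> h1, u2 -> h2.
dt, T = 0.01, 40.0
a, c = 0.4, 0.2          # outflow and coupling coefficients (assumed)
kp, ki = 2.0, 0.5        # PI gains, identical for both loops (assumed)
r1, r2 = 1.0, 0.5        # level setpoints

h1 = h2 = 0.0            # tank levels
i1 = i2 = 0.0            # integral states
for _ in range(int(T / dt)):
    e1, e2 = r1 - h1, r2 - h2
    i1 += e1 * dt
    i2 += e2 * dt
    u1 = kp * e1 + ki * i1
    u2 = kp * e2 + ki * i2
    # Linearized tank dynamics with inter-tank coupling, Euler-integrated.
    h1 += (-a * h1 + c * (h2 - h1) + u1) * dt
    h2 += (-a * h2 + c * (h1 - h2) + u2) * dt
```

The integral action drives both steady-state errors to zero despite the coupling term; time-domain measures (overshoot, settling time) can be read off the simulated response, while frequency-domain measures would come from the loop transfer functions.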

  19. Recommended documentation for computer users at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Heiberger, A.A.

    1992-04-01

    Recommended Documentation for Computer Users at ANL is for all users of the services available from the Argonne National Laboratory (ANL) Computing and Telecommunications Division (CTD). This document will guide you in selecting the available documentation that best fills your particular needs. Chapter 1 explains how to use this document to select documents and how to obtain them from the CTD Document Distribution Counter. Chapter 2 contains a table that categorizes available publications. Chapter 3 describes the online DOCUMENT command for CMS, VAX, and the Sun workstation. DOCUMENT allows you to scan for and order documentation that interests you. Chapter 4 lists publications by subject. Categories I and IX cover publications of a general nature and publications on telecommunications and networks, respectively. Categories II, III, IV, V, VI, VII, VIII, and X cover publications on specific computer systems. Category XI covers publications on advanced scientific computing at Argonne. Chapter 5 contains abstracts for each publication, arranged alphabetically. Chapter 6 describes additional publications containing bibliographies and master indexes that the user may find useful. The appendix identifies available computer systems, applications, languages, and libraries.

  20. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  1. Efficient multiscale magnetic-domain analysis of iron-core material under mechanical stress

    Science.gov (United States)

    Nishikubo, Atsushi; Ito, Shumpei; Mifune, Takeshi; Matsuo, Tetsuji; Kaido, Chikara; Takahashi, Yasuhito; Fujiwara, Koji

    2018-05-01

    For an efficient analysis of magnetization, a partial-implicit solution method is improved using an assembled domain structure model with six-domain mesoscopic particles exhibiting pinning-type hysteresis. The quantitative analysis of non-oriented silicon steel succeeds in predicting the stress dependence of hysteresis loss with computation times greatly reduced by using the improved partial-implicit method. The effect of cell division along the thickness direction is also evaluated.

  2. RG domain wall for the general ŝu(2) coset models

    Energy Technology Data Exchange (ETDEWEB)

    Stanishkov, Marian [Institute for Nuclear Research and Nuclear Energy,Bulgarian Academy of Sciences, 1784 Sofia (Bulgaria)

    2016-08-16

    We consider an RG flow in a general ŝu(2) coset model induced by the least relevant field. This is done using two different approaches. We first compute the mixing coefficients of certain fields in the UV and IR theories using conformal perturbation theory. The necessary structure constants are computed. The same coefficients can be calculated using the RG domain wall construction of Gaiotto. We compute the corresponding one-point functions and show that the two approaches give the same result at leading order.

  3. Scaling properties of domain wall networks

    International Nuclear Information System (INIS)

    Leite, A. M. M.; Martins, C. J. A. P.

    2011-01-01

    We revisit the cosmological evolution of domain wall networks, taking advantage of recent improvements in computing power. We carry out high-resolution field theory simulations in two, three and four spatial dimensions to study the effects of dimensionality and damping on the evolution of the network. Our results are consistent with the expected scale-invariant evolution of the network, which suggests that previous hints of deviations from this behavior may have been due to the limited dynamical range of those simulations. We also use the results of very large (1024^3) simulations in three cosmological epochs to provide a calibration for the velocity-dependent one-scale model for domain walls: we numerically determine the two free model parameters to have the values c_w = 0.5 ± 0.2 and k_w = 1.1 ± 0.3.
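The velocity-dependent one-scale (VOS) model being calibrated can be integrated directly. The sketch below uses the standard wall VOS equations with the calibrated central values c_w = 0.5 and k_w = 1.1 in a radiation-dominated era; the initial conditions and step control are our own choices:

```python
# VOS model for domain walls: L is the network correlation length, v the
# RMS wall velocity.  Standard form:
#   dL/dt = (1 + 3 v^2) H L + c_w v
#   dv/dt = (1 - v^2) (k_w / L - 3 H v)
cw, kw = 0.5, 1.1
t, L, v = 1.0, 1.0, 0.1        # arbitrary initial conditions
while t < 1e4:
    H = 1.0 / (2.0 * t)        # radiation-era Hubble rate, a ~ t^(1/2)
    dt = 0.01 * t              # logarithmic Euler step
    dL = (1.0 + 3.0 * v * v) * H * L + cw * v
    dv = (1.0 - v * v) * (kw / L - 3.0 * H * v)
    L += dL * dt
    v += dv * dt
    t += dt

eps = L / t  # linear scaling L = eps * t is the attractor

# Analytic scaling attractor, obtained by setting dv/dt = 0 and L = eps*t:
#   v* = sqrt(2 k_w / (3 (1 + 2 k_w))),  eps* = 2 k_w / (3 v*)
v_star = (2 * kw / (3 * (1 + 2 * kw))) ** 0.5
eps_star = 2 * kw / (3 * v_star)
```

Integrating over several decades in time shows the network relaxing onto the scale-invariant attractor, mirroring the scaling behavior reported from the field-theory simulations.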

  4. Modeling Network Traffic in Wavelet Domain

    Directory of Open Access Journals (Sweden)

    Sheng Ma

    2004-12-01

    Full Text Available This work discovers that although network traffic has complicated short- and long-range temporal dependence, the corresponding wavelet coefficients are no longer long-range dependent. Therefore, a "short-range" dependent process can be used to model network traffic in the wavelet domain. Both independent and Markov models are investigated. Theoretical analysis shows that the independent wavelet model is sufficiently accurate in terms of the buffer overflow probability for Fractional Gaussian Noise traffic. Any model which captures additional correlations in the wavelet domain only improves the performance marginally. The independent wavelet model is then used as a unified approach to model network traffic including VBR MPEG video and Ethernet data. The computational complexity is O(N) for developing such wavelet models and generating synthesized traffic of length N, which is among the lowest attained.
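A minimal version of the independent wavelet model can be sketched as follows (our own toy parameters; the paper's fitted variance scaling is not reproduced): Haar wavelet coefficients are drawn independently with a level-dependent variance, and the transform is inverted level by level to produce a synthetic trace, which keeps the overall cost O(N).

```python
import numpy as np

rng = np.random.default_rng(1)
levels, gamma, mean_rate = 12, 0.6, 100.0   # illustrative values

# Start from the coarsest scaling coefficient and refine level by level.
# The factor 2**(levels/2) makes the reconstructed trace mean = mean_rate.
approx = np.array([mean_rate * 2 ** (levels / 2.0)])
for j in range(levels):
    # Independent zero-mean detail coefficients; variance decays with
    # level j to mimic long-range-dependent scaling (assumed form).
    detail = rng.normal(scale=2 ** (-gamma * j / 2.0) * mean_rate,
                        size=approx.size)
    # Inverse orthonormal Haar step:
    # children = (a + d)/sqrt(2), (a - d)/sqrt(2).
    up = np.empty(2 * approx.size)
    up[0::2] = (approx + detail) / np.sqrt(2.0)
    up[1::2] = (approx - detail) / np.sqrt(2.0)
    approx = up

traffic = approx  # synthetic per-slot traffic rates, length 2**levels
```

Because the detail coefficients cancel pairwise in the sum, the mean of the synthesized trace equals `mean_rate` exactly, while the chosen variance decay controls the apparent temporal correlation structure.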

  5. Public Services 2.0: The Impact of Social Computing on Public Services

    OpenAIRE

    Punie, Y.; Misuraca, G.; Osimo, D.; Huijboom, N.; Broek, T.A. van den; Frissen, V.; Kool, L.

    2010-01-01

    Since 2003, the Internet has seen impressive growth in user-driven applications such as blogs, podcasts, wikis and social networking sites. This trend is referred to here as ‘social computing’ as online applications increasingly support the creation of value by social networks of people. The social computing trend has been recognised and monitored by the Institute for Prospective and Technological Studies (IPTS) over the past few years. IPTS observed a viral take up of social computing applic...

  6. Computer vision and imaging in intelligent transportation systems

    CERN Document Server

    Bala, Raja; Trivedi, Mohan

    2017-01-01

    Acts as a single source reference providing readers with an overview of how computer vision can contribute to the different applications in the field of road transportation. This book presents a survey of computer vision techniques related to three key broad problems in the roadway transportation domain: safety, efficiency, and law enforcement. The individual chapters present significant applications within these problem domains, each presented in a tutorial manner, describing the motivation for and benefits of the application, and a description of the state of the art.

  7. Perturbative evolution of particle orbits around Kerr black holes: time-domain calculation

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Aleman, Ramon [Physical Sciences Department, University of Puerto Rico-Rio Piedras, San Juan, PR 00931 (Puerto Rico); Khanna, Gaurav [Natural Science Division, Long Island University, Southampton, NY 11968 (United States); Pullin, Jorge [Department of Physics and Astronomy, Louisiana State University, 202 Nicholson Hall, Baton Rouge, LA 70803-4001 (United States)

    2003-07-21

    We consider the problem of the gravitational waves produced by a particle of negligible mass orbiting a Kerr black hole. We treat the Teukolsky perturbation equation in the time domain numerically as a 2 + 1 partial differential equation. We model the particle by smearing the singularities in the source term by the use of narrow Gaussian distributions. We have been able to reproduce earlier results for equatorial circular orbits that were computed using the frequency-domain formalism. The time-domain approach is however geared for a more general evolution, for instance of nearly geodesic orbits under the effects of radiation reaction.

  8. Perturbative evolution of particle orbits around Kerr black holes: time-domain calculation

    International Nuclear Information System (INIS)

    Lopez-Aleman, Ramon; Khanna, Gaurav; Pullin, Jorge

    2003-01-01

    We consider the problem of the gravitational waves produced by a particle of negligible mass orbiting a Kerr black hole. We treat the Teukolsky perturbation equation in the time domain numerically as a 2 + 1 partial differential equation. We model the particle by smearing the singularities in the source term by the use of narrow Gaussian distributions. We have been able to reproduce earlier results for equatorial circular orbits that were computed using the frequency-domain formalism. The time-domain approach is however geared for a more general evolution, for instance of nearly geodesic orbits under the effects of radiation reaction

  9. A domain-based approach to predict protein-protein interactions

    Directory of Open Access Journals (Sweden)

    Resat Haluk

    2007-06-01

    Full Text Available Abstract Background Knowing which proteins exist in a certain organism or cell type and how these proteins interact with each other is necessary for the understanding of biological processes at the whole-cell level. The determination of protein-protein interaction (PPI) networks has been the subject of extensive research. Despite the development of reasonably successful methods, serious technical difficulties still exist. In this paper we present DomainGA, a quantitative computational approach that uses information about domain-domain interactions to predict the interactions between proteins. Results DomainGA is a multi-parameter optimization method in which the available PPI information is used to derive a quantitative scoring scheme for the domain-domain pairs. The obtained domain interaction scores are then used to predict whether a pair of proteins interacts. Using the yeast PPI data and a series of tests, we show the robustness and insensitivity of the DomainGA method to the selection of the parameter sets, score ranges, and detection rules. Our DomainGA method achieves very high explanation ratios for the positive and negative PPIs in yeast. Based on our cross-verification tests on human PPIs, comparison of the optimized scores with the structurally observed domain interactions obtained from the iPFAM database, and sensitivity and specificity analysis, we conclude that our DomainGA method shows great promise to be applicable across multiple organisms. Conclusion We envision the DomainGA as the first step of a multiple-tier approach to constructing organism-specific PPIs. As it is based on fundamental structural information, the DomainGA approach can be used to create potential PPIs and the accuracy of the constructed interaction template can be further improved using complementary methods. Explanation ratios obtained in the reported test case studies clearly show that the false prediction rates of the template networks constructed
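The general idea of turning domain-pair scores into a protein-pair prediction can be sketched with a toy scheme in the spirit of DomainGA (the scores, domain names, and the "any pair interacts" combination rule below are our own assumptions, not the published method):

```python
from itertools import product

# Hypothetical domain-domain interaction scores in [0, 1]
# (in DomainGA these would be learned from known PPI data).
domain_scores = {
    frozenset(["SH3", "PRM"]): 0.9,
    frozenset(["kinase", "SH2"]): 0.7,
    frozenset(["PDZ"]): 0.4,  # PDZ-PDZ self-interaction
}

def interaction_prob(domains_a, domains_b):
    """P(interaction) = 1 - prod(1 - s_dd') over all domain pairs,
    i.e. two proteins interact if any of their domain pairs does,
    treating pairs as independent (a common simplifying assumption)."""
    p_none = 1.0
    for da, db in product(domains_a, domains_b):
        s = domain_scores.get(frozenset([da, db]), 0.0)
        p_none *= 1.0 - s
    return 1.0 - p_none

p = interaction_prob(["SH3", "kinase"], ["PRM", "SH2"])
```

Here the two contributing pairs give p = 1 - (1 - 0.9)(1 - 0.7) = 0.97; thresholding such scores yields the binary interact/not-interact prediction that is compared against known PPIs.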

  10. Integrating cross-scale analysis in the spatial and temporal domains for classification of behavioral movement

    Directory of Open Access Journals (Sweden)

    Ali Soleymani

    2014-06-01

    Full Text Available Since various behavioral movement patterns are likely to be valid within different, unique ranges of spatial and temporal scales (e.g., instantaneous, diurnal, or seasonal, with the corresponding spatial extents), a cross-scale approach is needed for accurate classification of behaviors expressed in movement. Here, we introduce a methodology for the characterization and classification of behavioral movement data that relies on computing and analyzing movement features jointly in both the spatial and temporal domains. The proposed methodology consists of three stages. In the first stage, focusing on the spatial domain, the underlying movement space is partitioned into several zonings that correspond to different spatial scales, and features related to movement are computed for each partitioning level. In the second stage, concentrating on the temporal domain, several movement parameters are computed from trajectories across a series of temporal windows of increasing sizes, yielding another set of input features for the classification. For both the spatial and the temporal domains, the "reliable scale" is determined by an automated procedure. This is the scale at which the best classification accuracy is achieved, using only spatial or temporal input features, respectively. The third stage takes the measures from the spatial and temporal domains of movement, computed at the corresponding reliable scales, as input features for behavioral classification. With a feature selection procedure, the most relevant features contributing to known behavioral states are extracted and used to learn a classification model. The potential of the proposed approach is demonstrated on a dataset of adult zebrafish (Danio rerio) swimming movements in testing tanks, following exposure to different drug treatments.
Our results show that behavioral classification accuracy greatly increases when cross-scale analysis is first used to determine the best analysis scale, and
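The temporal-domain stage described above (compute a movement parameter over windows of increasing size, then keep the window size that yields the best classification accuracy as the "reliable scale") can be sketched as follows. The speed-variability feature, the leave-one-out nearest-centroid classifier, and all names are illustrative assumptions, not the authors' implementation.

```python
from statistics import mean, pstdev

def scale_feature(speeds, window):
    """Variability of mean speed across non-overlapping temporal windows."""
    means = [mean(speeds[i:i + window])
             for i in range(0, len(speeds) - window + 1, window)]
    return pstdev(means)

def loo_accuracy(features, labels):
    """Leave-one-out accuracy of a 1-D nearest-centroid classifier."""
    correct = 0
    for i, x in enumerate(features):
        cents = {}
        for lab in set(labels):
            vals = [f for j, f in enumerate(features)
                    if j != i and labels[j] == lab]
            cents[lab] = mean(vals)
        pred = min(cents, key=lambda lab: abs(cents[lab] - x))
        correct += (pred == labels[i])
    return correct / len(features)

def reliable_scale(trajectories, labels, windows):
    """Window size whose features give the best classification accuracy."""
    return max(windows, key=lambda w: loo_accuracy(
        [scale_feature(t, w) for t in trajectories], labels))
```

On synthetic "burst" versus "cruise" speed traces, only the finest window separates the two classes, so it is selected as the reliable scale.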

  11. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side.
Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level

  12. A non-local computational boundary condition for duct acoustics

    Science.gov (United States)

    Zorumski, William E.; Watson, Willie R.; Hodge, Steve L.

    1994-01-01

    A non-local boundary condition is formulated for acoustic waves in ducts without flow. The ducts are two dimensional with constant area, but with variable impedance wall lining. Extension of the formulation to three dimensional and variable area ducts is straightforward in principle, but requires significantly more computation. The boundary condition simulates a nonreflecting wave field in an infinite duct. It is implemented by a constant matrix operator which is applied at the boundary of the computational domain. An efficient computational solution scheme is developed which allows calculations for high frequencies and long duct lengths. This computational solution utilizes the boundary condition to limit the computational space while preserving the radiation boundary condition. The boundary condition is tested for several sources. It is demonstrated that the boundary condition can be applied close to the sound sources, rendering the computational domain small. Computational solutions with the new non-local boundary condition are shown to be consistent with the known solutions for nonreflecting wavefields in an infinite uniform duct.

  13. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Calyam, Prasad

    2014-09-15

    The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.

  14. Public health program capacity for sustainability: a new framework.

    Science.gov (United States)

    Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C

    2013-02-01

    Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity; 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. 
The framework presents domains for public health decision makers to consider when developing

  15. The domain theory: patterns for knowledge and software reuse

    National Research Council Canada - National Science Library

    Sutcliffe, Alistair

    2002-01-01

    ..., retrieval system, or any other means, without prior written permission of the publisher. Lawrence Erlbaum Associates, Inc., Publishers 10 Industrial Avenue Mahwah, New Jersey 07430 Library of Congress Cataloging-in-Publication Data Sutcliffe, Alistair, 1951- The domain theory : patterns for knowledge and software reuse / Alistair Sutcl...

  16. Structure problems in the analog computation; Problemes de structure dans le calcul analogique

    Energy Technology Data Exchange (ETDEWEB)

    Braffort, P.L. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    Recent mathematical developments have shown the importance of the elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are put in evidence, and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as summing triangles (adders), integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations. But the course of computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. This structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  18. A domain sequence approach to pangenomics: applications to Escherichia coli [v1; ref status: indexed, http://f1000r.es/QSnDE6]

    Directory of Open Access Journals (Sweden)

    Lars-Gustav Snipen

    2012-10-01

    Full Text Available The study of microbial pangenomes relies on the computation of gene families, i.e. the clustering of coding sequences into groups of essentially similar genes. There is no standard approach to obtain such gene families. Ideally, the gene family computations should be robust against errors in the annotation of genes in various genomes. In an attempt to achieve this robustness, we propose to cluster sequences by their domain sequence, i.e. the ordered sequence of domains in their protein sequence. In a study of 347 genomes from Escherichia coli we find on average around 4500 proteins having hits in Pfam-A in every genome, clustering into around 2500 distinct domain sequence families in each genome. Across all genomes we find a total of 5724 such families. A binomial mixture model approach indicates this is around 95% of all domain sequences we would expect to see in E. coli in the future. A Heaps' law analysis indicates the population of domain sequences is larger, but this analysis is also very sensitive to smaller changes in the computation procedure. The resolution between strains is good despite the coarse grouping obtained by domain sequence families. Clustering sequences by their ordered domain content gives us domain sequence families, which are robust to errors in the gene prediction step. The computational load of the procedure scales linearly with the number of genomes, which is needed for the future explosion in the number of re-sequenced strains. The use of domain sequence families for a functional classification of strains clearly has some potential to be explored.
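The core grouping step is simple to sketch: two coding sequences belong to the same family exactly when their ordered lists of protein domains are identical. The minimal sketch below (with invented protein and Pfam identifiers, plus an illustrative pan/core count across genomes) is an assumption about the procedure, not the authors' pipeline.

```python
from collections import defaultdict

def domain_sequence_families(proteins):
    """Map each distinct ordered domain sequence to its member proteins.

    proteins: dict of protein id -> ordered list of domain identifiers.
    """
    families = defaultdict(list)
    for pid, domains in proteins.items():
        families[tuple(domains)].append(pid)  # order matters, so use a tuple
    return dict(families)

def pan_core_counts(genomes):
    """Pan (union) and core (intersection) family counts across genomes,
    where each genome is a dict as accepted by domain_sequence_families."""
    family_sets = [set(map(tuple, g.values())) for g in genomes]
    pan = set().union(*family_sets)
    core = set.intersection(*family_sets)
    return len(pan), len(core)
```

Because the key is the full ordered tuple of domains, gene-prediction errors that merely shift start or stop positions, without changing the domain content, leave the family assignment unchanged.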

  19. A note on domains of discourse. Logical know-how for integrated environmental modelling

    Energy Technology Data Exchange (ETDEWEB)

    Gerstengarbe, F.W. (ed.); Jaeger, C.C.

    2003-10-01

    Building computer models means implementing a mathematical structure on a piece of hardware in such a way that insights about some other phenomenon can be gained, remembered and communicated. For meaningful computer modelling, the phenomenon to be modelled must be described in a logically coherent way. This can be quite difficult, especially when a combination of highly heterogeneous scientific disciplines is needed, as is often the case in environmental research. The paper shows how the notion of a domain of discourse as developed by logicians can be used to map out the cognitive landscape of integrated modelling. This landscape is not a fixed universe, but a multiverse resonating with an evolving pluralism of domains of discourse. Integrated modelling involves a never-ending activity of translation between such domains, an activity that often goes hand in hand with major efforts to overcome conceptual confusions within given domains. For these purposes, a careful use of mathematics, including tools of formal logic presented in the paper, can be helpful. The concept of vulnerability as currently used in global change research is discussed as an example of the challenges to be met in integrated environmental modelling. (orig.)

  20. Unbound motion on a Schwarzschild background: Practical approaches to frequency domain computations

    Science.gov (United States)

    Hopper, Seth

    2018-03-01

    Gravitational perturbations due to a point particle moving on a static black hole background are naturally described in Regge-Wheeler gauge. The first-order field equations reduce to a single master wave equation for each radiative mode. The master function satisfying this wave equation is a linear combination of the metric perturbation amplitudes with a source term arising from the stress-energy tensor of the point particle. The original master functions were found by Regge and Wheeler (odd parity) and Zerilli (even parity). Subsequent work by Moncrief and then Cunningham, Price and Moncrief introduced new master variables which allow time domain reconstruction of the metric perturbation amplitudes. Here, I explore the relationship between these different functions and develop a general procedure for deriving new higher-order master functions from ones already known. The benefit of higher-order functions is that their source terms always converge faster at large distance than their lower-order counterparts. This makes for a dramatic improvement in both the speed and accuracy of frequency domain codes when analyzing unbound motion.

  1. Computational Design of High-χ Block Oligomers for Accessing 1 nm Domains.

    Science.gov (United States)

    Chen, Qile P; Barreda, Leonel; Oquendo, Luis E; Hillmyer, Marc A; Lodge, Timothy P; Siepmann, J Ilja

    2018-05-22

    Molecular dynamics simulations are used to design a series of high-χ block oligomers (HCBOs) that can self-assemble into a variety of mesophases with domain sizes as small as 1 nm. The exploration of these oligomers with various chain lengths, volume fractions, and chain architectures at multiple temperatures reveals the presence of ordered lamellae, perforated lamellae, and hexagonally packed cylinders. The achieved periods are as small as 3.0 and 2.1 nm for lamellae and cylinders, respectively, which correspond to polar domains of approximately 1 nm. Interestingly, the detailed phase behavior of these oligomers is distinct from that of either solvent-free surfactants or block polymers. The simulations reveal that the behavior of these HCBOs is a product of an interplay between both "surfactant factors" (headgroup interactions, chain flexibility, and interfacial curvature) and "block polymer factors" (χ, chain length N, and volume fraction f). This insight promotes the understanding of molecular features pivotal for mesophase formation at the sub-5 nm length scale, which facilitates the design of HCBOs tailored toward particular desired morphologies.

  2. Common-image gathers in the offset domain from reverse-time migration

    KAUST Repository

    Zhan, Ge

    2014-04-01

    Kirchhoff migration is flexible enough to output common-image gathers (CIGs) in the offset domain by imaging data with different offsets separately. These CIGs supply important information for velocity model updates and amplitude-variation-with-offset (AVO) analysis. Reverse-time migration (RTM) offers more insights into complex geology than Kirchhoff migration by accurately describing wave propagation using the two-way wave equation. However, it has difficulty producing offset-domain CIGs the way Kirchhoff migration does. In this paper, we develop a method for obtaining offset-domain CIGs from RTM. The method first computes the RTM operator of an offset gather, followed by a dot product of the operator and the offset data to form a common-offset RTM image. The offset-domain CIGs are then obtained after separately migrating data with different offsets. We generate offset-domain CIGs on both the Marmousi synthetic data and 2D Gulf of Mexico real data using this approach. © 2014.

  3. Fast Convolutional Sparse Coding in the Dual Domain

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2017-01-01

    Convolutional sparse coding (CSC) is an important building block of many computer vision applications ranging from image and video compression to deep learning. We present two contributions to the state of the art in CSC. First, we significantly speed up the computation by proposing a new optimization framework that tackles the problem in the dual domain. Second, we extend the original formulation to higher dimensions in order to process a wider range of inputs, such as color inputs, or HOG features. Our results show a significant speedup compared to the current state of the art in CSC.

  5. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
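The HMM stage can be illustrated with a small likelihood comparison: fit (or here, hand-set) one discrete-emission HMM per trust level and label a cue sequence by the model that scores it higher. The two-state models, cue encoding (1 = untrustworthy cue observed), and parameter values below are invented for demonstration; they are not the learned models from the study.

```python
def forward_likelihood(obs, start, trans, emit):
    """P(obs | model) via the forward algorithm, discrete emissions.

    start: initial state probabilities; trans[p][s]: transition p -> s;
    emit[s][o]: probability that state s emits symbol o.
    """
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return sum(alpha)

def classify(obs, models):
    """Label of the model assigning obs the highest likelihood.

    models: dict label -> (start, trans, emit) parameter triple.
    """
    return max(models, key=lambda lab: forward_likelihood(obs, *models[lab]))
```

In practice the per-level parameters would be learned from annotated interaction sequences; for long sequences the forward recursion is usually run in log space to avoid underflow.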

  6. Strengthening the Relationship of Internal Public Audit with External Public Audit

    Directory of Open Access Journals (Sweden)

    Ioan Alexandru Szabo

    2011-01-01

    Full Text Available The paper deals with the issue of functional relations between the structures of internal audit and external audit in the public domain. The introduction addresses the international standards and regulatory framework that stipulate the need to consolidate the cooperation relations between the two categories of professionals, namely internal auditors and external public auditors. The author emphasizes the importance of developing and signing the protocol of cooperation and collaboration between the two audit authorities, the Court of Accounts and the Public Finance, which supports the professional partnership. The article also presents the terms of the protocol signed between external audit and internal audit in the public sector. Finally, it is concluded that increasing the role of internal audit in the implementation and development of the management and internal control system within public entities is conditioned by the functionality of the collaboration and cooperation relationships between the two audits.

  7. Domain-enhanced analysis of microarray data using GO annotations.

    Science.gov (United States)

    Liu, Jiajun; Hughes-Oliver, Jacqueline M; Menius, J Alan

    2007-05-15

    New biological systems technologies give scientists the ability to measure thousands of bio-molecules including genes, proteins, lipids and metabolites. We use domain knowledge, e.g. the Gene Ontology, to guide analysis of such data. By focusing on domain-aggregated results at, say, the molecular function level, increased interpretability is available to biological scientists beyond what is possible if results are presented at the gene level. We use a 'top-down' approach to perform domain aggregation by first combining gene expressions before testing for differentially expressed patterns. This is in contrast to the more standard 'bottom-up' approach, where genes are first tested individually then aggregated by domain knowledge. The benefit is greater sensitivity for detecting signals. Our method, domain-enhanced analysis (DEA), is assessed and compared to other methods using simulation studies and analysis of two publicly available leukemia data sets. Our DEA method uses functions available in R (http://www.r-project.org/) and SAS (http://www.sas.com/). The two experimental data sets used in our analysis are available in R as Bioconductor packages, 'ALL' and 'golubEsets' (http://www.bioconductor.org/). Supplementary data are available at Bioinformatics online.
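The "top-down" aggregation can be sketched in a few lines: average the expression of all genes annotated to a GO term within each sample, then test the aggregated values between groups. The gene names, GO term, and use of Welch's t statistic below are illustrative assumptions; the authors' DEA implementation is in R and SAS.

```python
from math import sqrt
from statistics import mean, variance

def aggregate_by_domain(expr, annotation):
    """Per-sample mean expression over the genes annotated to each GO term.

    expr: dict gene -> list of per-sample expression values.
    annotation: dict GO term -> list of member genes.
    """
    agg = {}
    for term, genes in annotation.items():
        n_samples = len(expr[genes[0]])
        agg[term] = [mean(expr[g][i] for g in genes) for i in range(n_samples)]
    return agg

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

def domain_scores(expr, annotation, group_a, group_b):
    """t statistic per GO term, comparing two sets of sample indices."""
    agg = aggregate_by_domain(expr, annotation)
    return {term: welch_t([v[i] for i in group_a], [v[i] for i in group_b])
            for term, v in agg.items()}
```

Averaging before testing pools a weak but consistent shift across the member genes into one stronger signal, which is the sensitivity gain the abstract describes.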

  8. Women in biomedical engineering and health informatics and its impact on gender representation for accepted publications at IEEE EMBC 2007.

    Science.gov (United States)

    McGregor, Carolyn; Smith, Kathleen P; Percival, Jennifer

    2008-01-01

    The study of women within the professions of Engineering and Computer Science has consistently been found to demonstrate women as a minority within these professions. However none of that previous work has assessed publication behaviours based on gender. This paper presents research findings on gender distribution of authors of accepted papers for the IEEE Engineering and Medicine Society annual conference for 2007 (EMBC '07) held in Lyon, France. This information is used to present a position statement of the current state of gender representation for conference publication within the domain of biomedical engineering and health informatics. Issues in data preparation resulting from the lack of inclusion of gender in information gathered from accepted authors are presented and discussed.

  9. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    Science.gov (United States)

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with the public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and auto-computing process, respectively for preliminary evaluation and real time computation. The proposed model was evaluated by recomputing prior studies on the epidemiological measurement of diseases caused by either heavy metal exposure in the environment or clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Marrying Content and Process in Computer Science Education

    Science.gov (United States)

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  11. Supporting students' learning in the domain of computer science

    Science.gov (United States)

    Gasparinatou, Alexandra; Grigoriadou, Maria

    2011-03-01

    Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65) read one of four versions of a text concerning Local Network Topologies, orthogonally varying local and global cohesion. Participants' comprehension was examined through free-recall measure, text-based, bridging-inference, elaborative-inference, problem-solving questions and a sorting task. The results indicated that high-knowledge readers benefited from the low-cohesion text. The interaction of text cohesion and knowledge was reliable for the sorting activity, for elaborative-inference and for problem-solving questions. Although high-knowledge readers performed better in text-based and in bridging-inference questions with the low-cohesion text, the interaction of text cohesion and knowledge was not reliable. The results suggest a more complex view of when and for whom textual cohesion affects comprehension and consequently learning in computer science.

  12. Numerical simulation of electromagnetic wave propagation using time domain meshless method

    International Nuclear Information System (INIS)

    Ikuno, Soichiro; Fujita, Yoshihisa; Itoh, Taku; Nakata, Susumu; Nakamura, Hiroaki; Kamitani, Atsushi

    2012-01-01

    The electromagnetic wave propagation in variously shaped waveguides is simulated by using the meshless time domain method (MTDM). Generally, the finite-difference time-domain (FDTD) method is applied for electromagnetic wave propagation simulation. However, the numerical domain must be divided into rectangular meshes if the FDTD method is applied. On the other hand, the node disposition of MTDM can easily describe the structure of an arbitrarily shaped waveguide. This is the large advantage of the meshless time domain method. The results of computations show that the damping rate is stably calculated for R < 0.03, where R denotes the support radius of the weight function for the shape function. The results also indicate that the support radius R of the weight functions should be selected small, and monomials must be used for calculating the shape functions. (author)

  13. A domain-decomposition method to implement electrostatic free boundary conditions in the radial direction for electric discharges

    Science.gov (United States)

    Malagón-Romero, A.; Luque, A.

    2018-04-01

    At high pressure, electric discharges typically grow as thin, elongated filaments. In a numerical simulation this large aspect ratio should ideally translate into a narrow, cylindrical computational domain that envelops the discharge as closely as possible. However, the development of the discharge is driven by electrostatic interactions and, if the computational domain is not wide enough, the boundary conditions imposed on the electrostatic potential at the external boundary have a strong effect on the discharge. Most numerical codes circumvent this problem by either using a wide computational domain or by calculating the boundary conditions by integrating the Green's function of an infinite domain. Here we describe an accurate and efficient method to impose free boundary conditions in the radial direction for an elongated electric discharge. To facilitate the use of our method we provide a sample implementation. Finally, we apply the method to solve Poisson's equation in cylindrical coordinates with free boundary conditions in both radial and longitudinal directions. This case is of particular interest for the initial stages of discharges in long gaps or natural discharges in the atmosphere, where it is not practical to extend the simulation volume to be bounded by two electrodes.

  14. Seismic response of three-dimensional topographies using a time-domain boundary element method

    Science.gov (United States)

    Janod, François; Coutant, Olivier

    2000-08-01

    We present a time-domain implementation for a boundary element method (BEM) to compute the diffraction of seismic waves by 3-D topographies overlying a homogeneous half-space. This implementation is chosen to overcome the memory limitations arising when solving the boundary conditions with a frequency-domain approach. This formulation is flexible because it allows one to make an adaptive use of the Green's function time translation properties: the boundary conditions solving scheme can be chosen as a trade-off between memory and cpu requirements. We explore here an explicit method of solution that requires little memory but a high cpu cost in order to run on a workstation computer. We obtain good results with four points per minimum wavelength discretization for various topographies and plane wave excitations. This implementation can be used for two different aims: the time-domain approach allows an easier implementation of the BEM in hybrid methods (e.g. coupling with finite differences), and it also allows one to run simple BEM models with reasonable computer requirements. In order to keep reasonable computation times, we do not introduce any interface and we only consider homogeneous models. Results are shown for different configurations: an explosion near a flat free surface, a plane wave vertically incident on a Gaussian hill and on a hemispherical cavity, and an explosion point below the surface of a Gaussian hill. Comparison is made with other numerical methods, such as finite difference methods (FDMs) and spectral elements.

  15. Frequency-Domain Adaptive Algorithm for Network Echo Cancellation in VoIP

    Directory of Open Access Journals (Sweden)

    Patrick A. Naylor

    2008-05-01

    Full Text Available We propose a new low complexity, low delay, and fast converging frequency-domain adaptive algorithm for network echo cancellation in VoIP exploiting MMax and sparse partial (SP) tap-selection criteria in the frequency domain. We incorporate these tap-selection techniques into the multidelay filtering (MDF) algorithm in order to mitigate the delay inherent in frequency-domain algorithms. We illustrate two such approaches and discuss their tradeoff between convergence performance and computational complexity. Simulation results show an improvement in convergence rate for the proposed algorithm over MDF and significantly reduced complexity. The proposed algorithm achieves a convergence performance close to that of the recently proposed, but substantially more complex, improved proportionate MDF (IPMDF) algorithm.
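    The MMax tap-selection idea is easy to illustrate in the time domain: at each step only the taps whose current input samples are largest in magnitude are adapted. This is a simplified stand-in for the frequency-domain MDF version in the paper; the echo path, step size, and selection size below are illustrative:

```python
import numpy as np

def mmax_nlms_step(h, x_buf, d, mu=0.5, M=16, eps=1e-8):
    """One NLMS step that updates only the M taps whose input samples
    currently have the largest magnitude (the MMax criterion). This is
    a time-domain simplification; the paper applies the selection to
    frequency-domain MDF blocks."""
    e = d - h @ x_buf                             # a-priori error
    sel = np.argsort(np.abs(x_buf))[-M:]          # MMax tap selection
    norm = x_buf[sel] @ x_buf[sel] + eps
    h[sel] += mu * e * x_buf[sel] / norm
    return h, e

rng = np.random.default_rng(0)
L = 64
h_true = rng.standard_normal(L) * np.exp(-np.arange(L) / 10.0)  # echo path
h_est = np.zeros(L)
x = rng.standard_normal(20000)                    # far-end signal
for n in range(L, len(x)):
    x_buf = x[n - L:n][::-1]
    d = h_true @ x_buf                            # noiseless echo
    h_est, e = mmax_nlms_step(h_est, x_buf, d)
misalignment = np.linalg.norm(h_est - h_true) / np.linalg.norm(h_true)
```

    Updating 16 of 64 taps per step roughly quarters the update cost while, for white input, the filter still converges toward the true echo path.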

  16. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic service with the minimum resource, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003 to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecom. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  17. Numerical discrepancy between serial and MPI parallel computations

    Directory of Open Access Journals (Sweden)

    Sang Bong Lee

    2016-09-01

    Full Text Available Numerical simulations of the 1D Burgers equation and a 2D sloshing problem were carried out to study the numerical discrepancy between serial and parallel computations. The numerical domain was decomposed into 2 and 4 subdomains for parallel computations with message passing interface. The numerical solution of the Burgers equation revealed that the fully explicit boundary conditions used on subdomains of the parallel computation were responsible for the numerical discrepancy of the transient solution between serial and parallel computations. Two-dimensional sloshing problems in a rectangular domain were solved using OpenFOAM. After a lapse of the initial transient time, the sloshing patterns of water were significantly different in serial and parallel computations although the same numerical conditions were given. Based on the histograms of pressure measured at two points near the wall, the statistical characteristics of the numerical solution were not affected by the number of subdomains as much as the transient solution was.
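    The kind of interface-induced discrepancy described above can be reproduced with a deliberately simplified stand-in: an explicit scheme for the 1-D viscous Burgers equation is advanced serially and with two subdomains whose halo (interface) values are refreshed only every other step, so the subdomains intermittently work with stale neighbour data. All parameters below are illustrative:

```python
import numpy as np

def rhs(ue, nu, dx):
    """Explicit right-hand side of u_t + u u_x = nu u_xx for the
    interior of ue, which carries one ghost value at each end."""
    conv = -ue[1:-1] * (ue[2:] - ue[:-2]) / (2.0 * dx)
    diff = nu * (ue[2:] - 2.0 * ue[1:-1] + ue[:-2]) / dx**2
    return conv + diff

N, nu, steps = 201, 0.05, 400
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / nu                   # stable explicit step
u0 = np.sin(np.pi * x)                  # homogeneous Dirichlet ends

# Serial reference run.
u = u0.copy()
for _ in range(steps):
    u[1:-1] += dt * rhs(u, nu, dx)

# Two subdomains with halo values refreshed only every other step,
# so on odd steps each half sees one-step-old neighbour data.
m = N // 2
a, b = u0[:m].copy(), u0[m:].copy()
ga, gb = b[0], a[-1]                    # initial halo values
for n in range(steps):
    if n % 2 == 0:                      # exchange step: halos current
        ga, gb = b[0], a[-1]
    a_new, b_new = a.copy(), b.copy()
    a_new[1:] += dt * rhs(np.concatenate([a, [ga]]), nu, dx)
    b_new[:-1] += dt * rhs(np.concatenate([[gb], b]), nu, dx)
    a, b = a_new, b_new
discrepancy = np.max(np.abs(np.concatenate([a, b]) - u))
```

    The resulting discrepancy is nonzero but small, mirroring the observation above that the transient solution is sensitive to the decomposition while the statistics are much less so.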

  18. Pascal-SC a computer language for scientific computation

    CERN Document Server

    Bohlender, Gerd; von Gudenberg, Jürgen Wolff; Rheinboldt, Werner; Siewiorek, Daniel

    1987-01-01

    Perspectives in Computing, Vol. 17: Pascal-SC: A Computer Language for Scientific Computation focuses on the application of Pascal-SC, a programming language developed as an extension of standard Pascal, in scientific computation. The publication first elaborates on the introduction to Pascal-SC, a review of standard Pascal, and real floating-point arithmetic. Discussions focus on optimal scalar product, standard functions, real expressions, program structure, simple extensions, real floating-point arithmetic, vector and matrix arithmetic, and dynamic arrays. The text then examines functions a

  19. Mesh adaptation technique for Fourier-domain fluorescence lifetime imaging

    International Nuclear Information System (INIS)

    Soloviev, Vadim Y.

    2006-01-01

    A novel adaptive mesh technique in the Fourier domain is introduced for problems in fluorescence lifetime imaging. A dynamical adaptation of the three-dimensional scheme based on the finite volume formulation reduces computational time and balances the ill-posed nature of the inverse problem. Light propagation in the medium is modeled by the telegraph equation, while the lifetime reconstruction algorithm is derived from the Fredholm integral equation of the first kind. Stability and computational efficiency of the method are demonstrated by image reconstruction of two spherical fluorescent objects embedded in a tissue phantom

  20. Domain decomposition and multilevel integration for fermions

    International Nuclear Information System (INIS)

    Ce, Marco; Giusti, Leonardo; Schaefer, Stefan

    2016-01-01

    The numerical computation of many hadronic correlation functions is exceedingly difficult due to the exponentially decreasing signal-to-noise ratio with the distance between source and sink. Multilevel integration methods, using independent updates of separate regions in space-time, are known to be able to solve such problems but have so far been available only for pure gauge theory. We present first steps into the direction of making such integration schemes amenable to theories with fermions, by factorizing a given observable via an approximated domain decomposition of the quark propagator. This allows for multilevel integration of the (large) factorized contribution to the observable, while its (small) correction can be computed in the standard way.

  1. Improvement in Protein Domain Identification Is Reached by Breaking Consensus, with the Agreement of Many Profiles and Domain Co-occurrence.

    Directory of Open Access Journals (Sweden)

    Juliana Bernardes

    2016-07-01

    Full Text Available Traditional protein annotation methods describe known domains with probabilistic models representing consensus among homologous domain sequences. However, when relevant signals become too weak to be identified by a global consensus, attempts at annotation fail. Here we address the fundamental question of domain identification for highly divergent proteins. By using high performance computing, we demonstrate that the limits of state-of-the-art annotation methods can be bypassed. We design a new strategy based on the observation that many structural and functional protein constraints are not globally conserved through all species but might be locally conserved in separate clades. We propose a novel exploitation of the large amount of data available: 1. for each known protein domain, several probabilistic clade-centered models are constructed from a large and differentiated panel of homologous sequences, 2. a decision-making protocol combines outcomes obtained from multiple models, 3. a multi-criteria optimization algorithm finds the most likely protein architecture. The method is evaluated for domain and architecture prediction across several datasets and statistical hypothesis tests. Its performance is compared against HMMScan and HHblits, two widely used search methods based on sequence-profile and profile-profile comparison. Due to their closeness to actual protein sequences, clade-centered models are shown to be more specific and functionally predictive than the broadly used consensus models. Based on them, we improved annotation of Plasmodium falciparum protein sequences on a scale not previously possible. We successfully predict at least one domain for 72% of P. falciparum proteins, against 63% achieved previously, corresponding to a 30% improvement over the total number of Pfam domain predictions on the whole genome.
The method is applicable to any genome and opens new avenues to tackle evolutionary questions such as the reconstruction of

  2. Computer-aided visualization of database structural relationships

    International Nuclear Information System (INIS)

    Cahn, D.F.

    1980-04-01

    Interactive computer graphic displays can be extremely useful in augmenting understandability of data structures. In complexly interrelated domains such as bibliographic thesauri and energy information systems, node and link displays represent one such tool. This paper presents examples of data structure representations found useful in these domains and discusses some of their generalizable components. 2 figures

  3. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  4. Recent trends in computational photonics

    CERN Document Server

    Benson, Trevor; Rue, Richard; Wurtz, Gregory

    2017-01-01

    This book brings together the recent cutting-edge work on computational methods in photonics and their applications. The latest advances in techniques such as the Discontinuous Galerkin Time Domain method, Finite Element Time Domain method, and Finite Difference Time Domain method, as well as their applications, are presented. Key aspects such as modelling of non-linear effects (Second Harmonic Generation, lasing in fibers, including gain nonlinearity in metamaterials), the acousto-optic effect, and the hydrodynamic model to explain electron response in nanoplasmonic structures are included. The application areas covered include plasmonics, metamaterials, photonic crystals, dielectric waveguides, and fiber lasers. The chapters give a representative survey of the corresponding area.

  5. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel , Ronan; Voyer , Damien; Nicolas , Laurent; Scorretti , Riccardo; Burais , Noël

    2011-01-01

    Computation of electromagnetic fields in high-resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.

  6. Permeability computation on a REV with an immersed finite element method

    International Nuclear Information System (INIS)

    Laure, P.; Puaux, G.; Silva, L.; Vincent, M.

    2011-01-01

    An efficient method to compute the permeability of fibrous media is presented. An immersed domain approach is used to represent the porous material at its microscopic scale, and the flow motion is computed with a stabilized mixed finite element method: the Stokes equations are solved on the whole domain (including the solid part) using a penalty method. The accuracy is controlled by refining the mesh around the solid-fluid interface, defined by a level set function. Using homogenisation techniques, the permeability of a representative elementary volume (REV) is computed. The computed permeabilities of regular fibre packings are compared to classical analytical relations found in the literature.

  7. HEALTH GeoJunction: place-time-concept browsing of health publications.

    Science.gov (United States)

    MacEachren, Alan M; Stryker, Michael S; Turton, Ian J; Pezanowski, Scott

    2010-05-18

    The volume of health science publications is escalating rapidly. Thus, keeping up with developments is becoming harder as is the task of finding important cross-domain connections. When geographic location is a relevant component of research reported in publications, these tasks are more difficult because standard search and indexing facilities have limited or no ability to identify geographic foci in documents. This paper introduces HEALTH GeoJunction, a web application that supports researchers in the task of quickly finding scientific publications that are relevant geographically and temporally as well as thematically. HEALTH GeoJunction is a geovisual analytics-enabled web application providing: (a) web services using computational reasoning methods to extract place-time-concept information from bibliographic data for documents and (b) visually-enabled place-time-concept query, filtering, and contextualizing tools that apply to both the documents and their extracted content. This paper focuses specifically on strategies for visually-enabled, iterative, facet-like, place-time-concept filtering that allows analysts to quickly drill down to scientific findings of interest in PubMed abstracts and to explore relations among abstracts and extracted concepts in place and time. The approach enables analysts to: find publications without knowing all relevant query parameters, recognize unanticipated geographic relations within and among documents in multiple health domains, identify the thematic emphasis of research targeting particular places, notice changes in concepts over time, and notice changes in places where concepts are emphasized. PubMed is a database of over 19 million biomedical abstracts and citations maintained by the National Center for Biotechnology Information; achieving quick filtering is an important contribution due to the database size. Including geography in filters is important due to rapidly escalating attention to geographic factors in public

  8. HEALTH GeoJunction: place-time-concept browsing of health publications

    Directory of Open Access Journals (Sweden)

    Turton Ian J

    2010-05-01

    Full Text Available Abstract Background The volume of health science publications is escalating rapidly. Thus, keeping up with developments is becoming harder as is the task of finding important cross-domain connections. When geographic location is a relevant component of research reported in publications, these tasks are more difficult because standard search and indexing facilities have limited or no ability to identify geographic foci in documents. This paper introduces HEALTH GeoJunction, a web application that supports researchers in the task of quickly finding scientific publications that are relevant geographically and temporally as well as thematically. Results HEALTH GeoJunction is a geovisual analytics-enabled web application providing: (a) web services using computational reasoning methods to extract place-time-concept information from bibliographic data for documents and (b) visually-enabled place-time-concept query, filtering, and contextualizing tools that apply to both the documents and their extracted content. This paper focuses specifically on strategies for visually-enabled, iterative, facet-like, place-time-concept filtering that allows analysts to quickly drill down to scientific findings of interest in PubMed abstracts and to explore relations among abstracts and extracted concepts in place and time. The approach enables analysts to: find publications without knowing all relevant query parameters, recognize unanticipated geographic relations within and among documents in multiple health domains, identify the thematic emphasis of research targeting particular places, notice changes in concepts over time, and notice changes in places where concepts are emphasized. Conclusions PubMed is a database of over 19 million biomedical abstracts and citations maintained by the National Center for Biotechnology Information; achieving quick filtering is an important contribution due to the database size. Including geography in filters is important due to

  9. Statistical CT noise reduction with multiscale decomposition and penalized weighted least squares in the projection domain

    International Nuclear Information System (INIS)

    Tang Shaojie; Tang Xiangyang

    2012-01-01

    Purposes: The suppression of noise in x-ray computed tomography (CT) imaging is of clinical relevance for diagnostic image quality and the potential for radiation dose saving. Toward this purpose, statistical noise reduction methods in either the image or projection domain have been proposed, which employ a multiscale decomposition to enhance the performance of noise suppression while maintaining image sharpness. Recognizing the advantages of noise suppression in the projection domain, the authors propose a projection domain multiscale penalized weighted least squares (PWLS) method, in which the angular sampling rate is explicitly taken into consideration to account for the possible variation of interview sampling rate in advanced clinical or preclinical applications. Methods: The projection domain multiscale PWLS method is derived by converting an isotropic diffusion partial differential equation in the image domain into the projection domain, wherein a multiscale decomposition is carried out. With adoption of the Markov random field or soft thresholding objective function, the projection domain multiscale PWLS method deals with noise at each scale. To compensate for the degradation in image sharpness caused by the projection domain multiscale PWLS method, an edge enhancement is carried out following the noise reduction. The performance of the proposed method is experimentally evaluated and verified using the projection data simulated by computer and acquired by a CT scanner. Results: The preliminary results show that the proposed projection domain multiscale PWLS method outperforms the projection domain single-scale PWLS method and the image domain multiscale anisotropic diffusion method in noise reduction. In addition, the proposed method can preserve image sharpness very well while the occurrence of “salt-and-pepper” noise and mosaic artifacts can be avoided. Conclusions: Since the interview sampling rate is taken into account in the projection domain
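    The penalized weighted least squares step can be illustrated in one dimension, where the objective reduces to a single linear solve. This is a generic single-scale PWLS sketch with an identity weighting and a first-difference roughness penalty, not the authors' projection-domain multiscale method:

```python
import numpy as np

def pwls_denoise(y, weights, beta):
    """Minimise (x - y)^T W (x - y) + beta * ||D x||^2, where D is the
    first-difference operator; the minimiser solves
    (W + beta D^T D) x = W y."""
    n = len(y)
    W = np.diag(weights)
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)   # first differences
    return np.linalg.solve(W + beta * (D.T @ D), W @ y)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2.0 * np.pi * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = pwls_denoise(noisy, np.ones(t.size), beta=5.0)
err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
```

    In the projection-domain setting the weights would reflect the noise variance of each measurement, and the solve would be applied per scale of the multiscale decomposition.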

  10. Alpha particle effects as a test domain for PAP, a Plasma Apprentice Program

    International Nuclear Information System (INIS)

    Mynick, H.E.

    1987-01-01

    A new type of computational tool under development, employing techniques of symbolic computation and artificial intelligence to automate as far as possible the research activities of a human plasma theorist, is described. Its present and potential uses are illustrated using the area of the theory of alpha particle effects in fusion plasmas as a sample domain. (orig.)

  11. Public scientific communication: reflections on the public and its participation forms

    Directory of Open Access Journals (Sweden)

    Peter Sekloča

    2009-09-01

    Full Text Available Scientific communication also pertains to the domain of society, where the formation of public opinion about science and technology takes place. Concerning this process, two main points are presented in the commentary. The first is a proposition on how the public as a social category may be conceptualized, and the second is the extent of the participation of members of the public in strengthening socialization and democratization practices in new, highly complex contexts of scientific research. The public is conceptualized to include all citizens regardless of their professional origin, including scientists, which promotes the idea of openness and equality of the public sphere where scientific issues are discussed. To be democratic in its practical-political setting, such a conception needs to deal with the problems of participation in a highly mediatized world, where not every member of the public can be included in scientific research. The author thus reflects on the mechanisms that would enable the formation of public forums where the trust of influential public actors as stakeholders of research can be tested.

  12. A study of Computing doctorates in South Africa from 1978 to 2014

    Directory of Open Access Journals (Sweden)

    Ian D Sanders

    2015-12-01

    Full Text Available This paper studies the output of South African universities in terms of computing-related doctorates in order to determine trends in numbers of doctorates awarded and to identify strong doctoral study research areas. Data collected from a variety of sources relating to Computing doctorates conferred since the late 1970s was used to compare the situation in Computing with that of all doctorates. The number of Computing doctorates awarded has increased considerably over the period of study. Nearly three times as many doctorates were awarded in the period 2010–2014 as in 2000–2004. The universities producing the most Computing doctorates were either previously “traditional” universities or comprehensive universities formed by amalgamating a traditional research university with a technikon. Universities of technology have not yet produced many doctorates as they do not have a strong research tradition. The analysis of topic keywords using ACM Computing classifications is preliminary but shows that professional issues are dominant in Information Systems, models are often built in Computer Science and several topics, including computing in education, are evident in both IS and CS. The relevant data is in the public domain but access is difficult as record keeping was generally inconsistent and incomplete. In addition, electronic databases at universities are not easily searchable and access to HEMIS data is limited. The database built for this paper is more inclusive in terms of discipline-related data than others.

  13. Closed-form estimates of the domain of attraction for nonlinear systems via fuzzy-polynomial models.

    Science.gov (United States)

    Pitarch, José Luis; Sala, Antonio; Ariño, Carlos Vicente

    2014-04-01

    In this paper, the domain of attraction of the origin of a nonlinear system is estimated in closed form via level sets with polynomial boundaries, iteratively computed. In particular, the domain of attraction is expanded from a previous estimate, such as a classical Lyapunov level set. With the use of fuzzy-polynomial models, the domain of attraction analysis can be carried out via sum of squares optimization and an iterative algorithm. The result is a function that bounds the domain of attraction, free from the usual restriction of being positive and decrescent in all the interior of its level sets.

  14. The BRCT domain is a phospho-protein binding domain.

    Science.gov (United States)

    Yu, Xiaochun; Chini, Claudia Christiano Silva; He, Miao; Mer, Georges; Chen, Junjie

    2003-10-24

    The carboxyl-terminal domain (BRCT) of the Breast Cancer Gene 1 (BRCA1) protein is an evolutionarily conserved module that exists in a large number of proteins from prokaryotes to eukaryotes. Although most BRCT domain-containing proteins participate in DNA-damage checkpoint or DNA-repair pathways, or both, the function of the BRCT domain is not fully understood. We show that the BRCA1 BRCT domain directly interacts with phosphorylated BRCA1-Associated Carboxyl-terminal Helicase (BACH1). This specific interaction between BRCA1 and phosphorylated BACH1 is cell cycle regulated and is required for DNA damage-induced checkpoint control during the transition from G2 to M phase of the cell cycle. Further, we show that two other BRCT domains interact with their respective physiological partners in a phosphorylation-dependent manner. Thirteen additional BRCT domains also preferentially bind phospho-peptides rather than nonphosphorylated control peptides. These data imply that the BRCT domain is a phospho-protein binding domain involved in cell cycle control.

  15. The effects of computer-based mindfulness training on Self-control and Mindfulness within Ambulatorily assessed network Systems across Health-related domains in a healthy student population (SMASH): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Rowland, Zarah; Wenzel, Mario; Kubiak, Thomas

    2016-12-01

    Self-control is an important ability in everyday life, showing associations with health-related outcomes. The aim of the Self-control and Mindfulness within Ambulatorily assessed network Systems across Health-related domains (SMASH) study is twofold: first, the effectiveness of a computer-based mindfulness training will be evaluated in a randomized controlled trial. Second, the SMASH study implements a novel network approach in order to investigate complex temporal interdependencies of self-control networks across several domains. The SMASH study is a two-armed, 6-week, non-blinded randomized controlled trial that combines seven weekly laboratory meetings and 40 days of electronic diary assessments with six prompts per day in a healthy undergraduate student population at the Johannes Gutenberg University Mainz, Germany. Participants will be randomly assigned to (1) receive a computer-based mindfulness intervention or (2) to a wait-list control condition. Primary outcomes are self-reported momentary mindfulness and self-control assessed via electronic diaries. Secondary outcomes are habitual mindfulness and habitual self-control. Further measures include self-reported behaviors in specific self-control domains: emotion regulation, alcohol consumption and eating behaviors. The effects of mindfulness training on primary and secondary outcomes are explored using three-level mixed models. Furthermore, networks will be computed with vector autoregressive mixed models to investigate the dynamics at participant and group level. This study was approved by the local ethics committee (reference code 2015_JGU_psychEK_011) and follows the standards laid down in the Declaration of Helsinki (2013). This randomized controlled trial combines an intensive Ambulatory Assessment of 40 consecutive days and seven laboratory meetings. By implementing a novel network approach, underlying processes of self-control within different health domains will be identified. These results will

  16. Structured Literature Review of disruptive innovation theory within the digital domain

    DEFF Research Database (Denmark)

    Vesti, Helle; Nielsen, Christian; Rosenstand, Claus Andreas Foss

    2017-01-01

    The area of interest is disruption in the digital domain. The research questions are: How has the disruption and digital disruption literature developed over time? What is the focus of research into disruption regarding the digital domain, and how has this changed over time? Which methods are being utilized in research regarding disruption and digital disruption? Where are the key contributors to disruption in general and in digital disruption? Is there a future for digital disruption research? The method is a Structured Literature Review (SLR). The contribution is the results of an analysis of 95 publications within the field of disruption in the digital domain and disruptive innovation theory in general. Works of twelve practitioners and 83 academics are investigated.

  17. A Parallel Non-Overlapping Domain-Decomposition Algorithm for Compressible Fluid Flow Problems on Triangulated Domains

    Science.gov (United States)

    Barth, Timothy J.; Chan, Tony F.; Tang, Wei-Pai

    1998-01-01

    This paper considers an algebraic preconditioning algorithm for hyperbolic-elliptic fluid flow problems. The algorithm is based on a parallel non-overlapping Schur complement domain-decomposition technique for triangulated domains. In the Schur complement technique, the triangulation is first partitioned into a number of non-overlapping subdomains and interfaces. This suggests a reordering of triangulation vertices which separates subdomain and interface solution unknowns. The reordering induces a natural 2 x 2 block partitioning of the discretization matrix. Exact LU factorization of this block system yields a Schur complement matrix which couples subdomains and the interface together. The remaining sections of this paper present a family of approximate techniques for both constructing and applying the Schur complement as a domain-decomposition preconditioner. The approximate Schur complement serves as an algebraic coarse space operator, thus avoiding the known difficulties associated with the direct formation of a coarse space discretization. In developing Schur complement approximations, particular attention has been given to improving sequential and parallel efficiency of implementations without significantly degrading the quality of the preconditioner. A computer code based on these developments has been tested on the IBM SP2 using MPI message passing protocol. A number of 2-D calculations are presented for both scalar advection-diffusion equations as well as the Euler equations governing compressible fluid flow to demonstrate performance of the preconditioning algorithm.
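    The exact block elimination underlying the Schur complement technique can be written out for a toy system. The partition sizes and the random SPD matrix below are illustrative; in the paper the blocks come from the subdomain/interface reordering of a triangulation:

```python
import numpy as np

rng = np.random.default_rng(1)
ns, ni = 8, 3                       # subdomain and interface unknowns
A = rng.standard_normal((ns + ni, ns + ni))
A = A @ A.T + (ns + ni) * np.eye(ns + ni)   # well-conditioned SPD matrix
b = rng.standard_normal(ns + ni)

# 2 x 2 block partition induced by the subdomain/interface reordering.
Ass, Asi = A[:ns, :ns], A[:ns, ns:]
Ais, Aii = A[ns:, :ns], A[ns:, ns:]

# Schur complement coupling the interface unknowns:
# S = Aii - Ais Ass^{-1} Asi
S = Aii - Ais @ np.linalg.solve(Ass, Asi)

# Block forward/backward substitution using S.
bs, bi = b[:ns], b[ns:]
xi = np.linalg.solve(S, bi - Ais @ np.linalg.solve(Ass, bs))
xs = np.linalg.solve(Ass, bs - Asi @ xi)
x = np.concatenate([xs, xi])
```

    The approximate preconditioners discussed above replace the exact solves with Ass and S by cheaper approximations while keeping this block structure.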

  18. Frequency domain finite-element and spectral-element acoustic wave modeling using absorbing boundaries and perfectly matched layer

    Science.gov (United States)

    Rahimi Dalkhani, Amin; Javaherian, Abdolrahim; Mahdavi Basir, Hadi

    2018-04-01

    Wave propagation modeling, a vital tool in seismology, can be done via several different numerical methods, among them the finite-difference, finite-element, and spectral-element methods (FDM, FEM and SEM). Some advanced applications in seismic exploration benefit from frequency-domain modeling. Regarding flexibility in complex geological models and dealing with the free surface boundary condition, we studied the frequency domain acoustic wave equation using FEM and SEM. The results demonstrated that the frequency domain FEM and SEM have a good accuracy and numerical efficiency with second order interpolation polynomials. Furthermore, we developed the second order Clayton and Engquist absorbing boundary condition (CE-ABC2) and compared it with the perfectly matched layer (PML) for the frequency domain FEM and SEM. Unlike the PML method, CE-ABC2 does not add any additional computational cost to the modeling except assembling boundary matrices. As a result, CE-ABC2 is more efficient than PML for frequency domain acoustic wave propagation modeling, especially when the computational cost is high and high-level absorbing performance is unnecessary.

  19. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power; it increases the visibility of the gateway to the general public as well as adding computing capacity at little cost.

  20. Understanding disciplinary vocabularies using a full-text enabled domain-independent term extraction approach.

    Science.gov (United States)

    Yan, Erjia; Williams, Jake; Chen, Zheng

    2017-01-01

    Publication metadata help deliver rich analyses of scholarly communication. However, research concepts and ideas are more effectively expressed through unstructured fields such as full texts. Thus, the goals of this paper are to employ a full-text enabled method to extract terms relevant to disciplinary vocabularies and, through them, to understand the relationships between disciplines. This paper uses an efficient, domain-independent term extraction method to extract disciplinary vocabularies from a large multidisciplinary corpus of PLoS ONE publications. It finds a power-law pattern in the frequency distributions of terms present in each discipline, indicating a semantic richness potentially sufficient for further study and advanced analysis. The salient relationships amongst these vocabularies become apparent in application of a principal component analysis. For example, Mathematics and Computer and Information Sciences were found to have similar vocabulary use patterns, along with Engineering and Physics; while Chemistry and the Social Sciences were found to exhibit contrasting vocabulary use patterns, along with the Earth Sciences and Chemistry. These results have implications for studies of scholarly communication as scholars attempt to identify the epistemological cultures of disciplines, and as a full text-based methodology could lead to machine learning applications in the automated classification of scholarly work according to disciplinary vocabularies.
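    The power-law frequency pattern reported above can be checked on any token stream by fitting the log-log rank-frequency slope; a slope near -1 indicates a Zipf-like distribution. The corpus below is a synthetic stand-in for extracted disciplinary terms:

```python
from collections import Counter
import math

def rank_frequency_slope(tokens, top=50):
    """Least-squares slope of log(frequency) vs log(rank) over the
    `top` most frequent terms; near -1 suggests a Zipf-like power law."""
    freqs = sorted(Counter(tokens).values(), reverse=True)[:top]
    xs = [math.log(r + 1) for r in range(len(freqs))]   # log rank
    ys = [math.log(f) for f in freqs]                   # log frequency
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic Zipfian corpus as a stand-in for extracted terms.
vocab = [f"term{i}" for i in range(200)]
tokens = [w for i, w in enumerate(vocab) for _ in range(200 // (i + 1))]
slope = rank_frequency_slope(tokens)
```

    On real extracted vocabularies the same fit can be run per discipline to compare the heaviness of their frequency tails.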

  1. Gimme Context – towards New Domain-Specific Collocational Dictionaries

    Directory of Open Access Journals (Sweden)

    Sylvana Krausse

    2011-04-01

    Full Text Available The days of traditional drudgery-filled lexicography are long gone. Fortunately, computers today help with the enormous task of storing and analysing language in order to condense the information found into dictionaries. In this paper, the way from a corpus to a small domain-specific collocational dictionary is described, exemplified by the domain-specific language of mining reclamation, and can be duplicated for other specialised languages too. So far, domain-specific dictionaries are mostly rare, as their creation is very labour-intensive and thus costly, and all too often they are just a collection of terms plus translations without any information on how to use them in speech. Particularly small domains with few users have been disregarded by lexicographers, as there is always the question of how well a dictionary sells afterwards. Following this, I describe the creation of a small collocational dictionary of mining reclamation language that is based on the consistent use of corpus information. It is relatively quick to realise in the design phase and is intended to provide the sort of linguistic information engineering experts need when they communicate in English or read specialist texts in the specific domain.
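
    One common corpus-driven route to collocation candidates of this kind is pointwise mutual information (PMI) over adjacent word pairs. The sketch below is a minimal illustration with invented mining-reclamation snippets, not the author's actual procedure:

```python
import math
from collections import Counter

def collocations(corpus, min_count=2):
    """Score adjacent word pairs by PMI: log P(w1,w2) / (P(w1) P(w2)).

    High-PMI pairs are collocation candidates of the kind a
    domain-specific collocational dictionary would record."""
    words = [w for sent in corpus for w in sent]
    n = len(words)
    unigrams = Counter(words)
    # count bigrams within sentences only, not across sentence boundaries
    bigrams = Counter(p for sent in corpus for p in zip(sent, sent[1:]))
    scores = {
        (w1, w2): math.log((c / n) / ((unigrams[w1] / n) * (unigrams[w2] / n)))
        for (w1, w2), c in bigrams.items() if c >= min_count
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Invented mining-reclamation snippets (illustration only).
corpus = [
    "overburden removal precedes soil reconstruction".split(),
    "soil reconstruction follows overburden removal".split(),
    "reclamation requires soil reconstruction".split(),
]
top = collocations(corpus)
```

    Note that the rarer pair ("overburden", "removal") outranks the more frequent ("soil", "reconstruction"), because PMI rewards pairs whose parts occur almost exclusively together.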

  2. Computational analysis and prediction of the binding motif and protein interacting partners of the Abl SH3 domain.

    Directory of Open Access Journals (Sweden)

    Tingjun Hou

    2006-01-01

    Full Text Available Protein-protein interactions, particularly weak and transient ones, are often mediated by peptide recognition domains, such as Src Homology 2 and 3 (SH2 and SH3) domains, which bind to specific sequence and structural motifs. It is important but challenging to determine the binding specificity of these domains accurately and to predict their physiological interacting partners. In this study, the interactions between 35 peptide ligands (15 binders and 20 non-binders) and the Abl SH3 domain were analyzed using molecular dynamics simulation and the Molecular Mechanics/Poisson-Boltzmann Surface Area (MM/PBSA) method. The calculated binding free energies correlated well with the rank order of the binding peptides and clearly distinguished binders from non-binders. Free energy component analysis revealed that the van der Waals interactions dictate the binding strength of peptides, whereas the binding specificity is determined by the electrostatic interaction and the polar contribution of desolvation. The binding motif of the Abl SH3 domain was then determined by a virtual mutagenesis method, which mutates the residue at each position of the template peptide to each of the other 19 amino acids and calculates the binding free energy difference between the template and the mutated peptides using the MM/PBSA method. A single-position mutation free energy profile was thus established and used as a scoring matrix to search for peptides recognized by the Abl SH3 domain in the human genome. Our approach successfully picked ten out of 13 experimentally determined binding partners of the Abl SH3 domain among the top 600 candidates from the 218,540 decapeptides with the PXXP motif in the SWISS-PROT database. We expect that this physical-principle-based method can be applied to other protein domains as well.
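
    The final scanning step resembles a position-specific scoring-matrix search restricted to PXXP-containing peptides. The sketch below illustrates only the idea, with an invented, tiny profile over 4-mers; in the paper the profile comes from MM/PBSA free energy differences and the peptides are decapeptides.

```python
import re

# Hypothetical single-position mutation free-energy profile: PROFILE[pos][aa]
# (lower = more favourable, mimicking binding free energy differences).
# All values are invented for illustration only.
PROFILE = {
    0: {"A": 0.0, "P": -1.2, "L": -0.5},
    1: {"A": 0.0, "P": -0.3, "L": -0.8},
    2: {"A": 0.0, "P": -0.1, "L": -0.2},
    3: {"A": 0.0, "P": -1.5, "L": -0.4},
}

def score_peptide(pep, profile):
    """Sum per-position scores; residues absent from the profile score 0."""
    return sum(profile.get(i, {}).get(aa, 0.0) for i, aa in enumerate(pep))

def rank_pxxp_candidates(peptides, profile):
    """Keep peptides containing the PXXP motif, best (lowest) score first."""
    hits = [p for p in peptides if re.search(r"P..P", p)]
    return sorted(hits, key=lambda p: score_peptide(p, profile))

candidates = ["PALP", "AALA", "PLAP", "LPPL"]
ranked = rank_pxxp_candidates(candidates, PROFILE)
```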

  3. A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines

    Science.gov (United States)

    Pennacchi, P.; Borghesani, P.; Chatterton, S.

    2015-08-01

    Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life through increased fatigue on the components and cavitation phenomena. Whereas an exhaustive list of publications on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, not least because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit operating in a real power plant, tackling the issues of adapting such diagnostic tools for the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.
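
    Order tracking of the kind combined here with cyclostationary tools boils down to resampling the vibration signal at uniform shaft-angle increments instead of uniform time steps. A minimal sketch under stated assumptions (linear interpolation, hypothetical function and variable names, not the authors' implementation):

```python
def angular_resample(signal, shaft_angle, samples_per_rev=8):
    """Resample a time-domain signal at uniform shaft-angle increments.

    `shaft_angle` holds the accumulated rotation (in revolutions) at each
    sample; evaluating the signal at equal angle steps is the order
    tracking step that separates rotation-synchronous components from
    non-synchronous ones."""
    n_out = int(shaft_angle[-1] * samples_per_rev)
    out = []
    for k in range(n_out):
        target = k / samples_per_rev
        # locate the bracketing angle samples, then interpolate linearly
        for i in range(len(shaft_angle) - 1):
            if shaft_angle[i] <= target <= shaft_angle[i + 1]:
                frac = (target - shaft_angle[i]) / (shaft_angle[i + 1] - shaft_angle[i])
                out.append(signal[i] + frac * (signal[i + 1] - signal[i]))
                break
    return out

# Constant-speed example: one revolution over the record, linear signal.
times = [i * 0.1 for i in range(11)]
angle = list(times)            # revolutions completed at each time stamp
signal = [2.0 * t for t in times]
resampled = angular_resample(signal, angle)
```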

  4. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science has been evolving continuously over the past decades. This evolution has brought forth new knowledge that should be incorporated into curricula, and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…

  5. Domain analysis

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.

  6. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up…

  7. Fast time- and frequency-domain finite-element methods for electromagnetic analysis

    Science.gov (United States)

    Lee, Woochan

    Fast electromagnetic analysis in the time and frequency domain is of critical importance to the design of integrated circuits (IC) and other advanced engineering products and systems. Many IC structures constitute a very large scale problem in modeling and simulation, the size of which also continuously grows with the advancement of the processing technology. This results in numerical problems beyond the reach of even the most powerful existing computational resources. Unlike many other engineering problems, the structure of most ICs is special in the sense that its geometry is of Manhattan type and its dielectrics are layered. Hence, it is important to develop structure-aware algorithms that take advantage of these structural specialties to speed up the computation. In addition, among existing time-domain methods, explicit methods can avoid solving a matrix equation. However, their time step is traditionally restricted by the space step for ensuring the stability of a time-domain simulation. Therefore, making explicit time-domain methods unconditionally stable is important to accelerate the computation. Beyond time-domain methods, frequency-domain methods have suffered from an indefinite system that makes it difficult for iterative solutions to converge quickly. The first contribution of this work is a fast time-domain finite-element algorithm for the analysis and design of very large-scale on-chip circuits. The structural specialties of on-chip circuits, such as Manhattan geometry and layered permittivity, are preserved in the proposed algorithm. As a result, the large-scale matrix solution encountered in the 3-D circuit analysis is turned into a simple scaling of the solution of a small 1-D matrix, which can be obtained in linear (optimal) complexity with negligible cost. Furthermore, the time step size is not sacrificed, and the total number of time steps to be simulated is also significantly reduced, thus achieving a total cost reduction in CPU time.
The second contribution

  8. Plasmonic computing of spatial differentiation

    Science.gov (United States)

    Zhu, Tengfeng; Zhou, Yihan; Lou, Yijie; Ye, Hui; Qiu, Min; Ruan, Zhichao; Fan, Shanhui

    2017-05-01

    Optical analog computing offers high-throughput low-power-consumption operation for specialized computational tasks. Traditionally, optical analog computing in the spatial domain uses a bulky system of lenses and filters. Recent developments in metamaterials enable the miniaturization of such computing elements down to a subwavelength scale. However, the required metamaterial consists of a complex array of meta-atoms, and direct demonstration of image processing is challenging. Here, we show that the interference effects associated with surface plasmon excitations at a single metal-dielectric interface can perform spatial differentiation. We then experimentally demonstrate edge detection of an image without any Fourier lens. This work points to a simple yet powerful mechanism for optical analog computing at the nanoscale.

  9. Plasmonic computing of spatial differentiation.

    Science.gov (United States)

    Zhu, Tengfeng; Zhou, Yihan; Lou, Yijie; Ye, Hui; Qiu, Min; Ruan, Zhichao; Fan, Shanhui

    2017-05-19

    Optical analog computing offers high-throughput low-power-consumption operation for specialized computational tasks. Traditionally, optical analog computing in the spatial domain uses a bulky system of lenses and filters. Recent developments in metamaterials enable the miniaturization of such computing elements down to a subwavelength scale. However, the required metamaterial consists of a complex array of meta-atoms, and direct demonstration of image processing is challenging. Here, we show that the interference effects associated with surface plasmon excitations at a single metal-dielectric interface can perform spatial differentiation. We then experimentally demonstrate edge detection of an image without any Fourier lens. This work points to a simple yet powerful mechanism for optical analog computing at the nanoscale.
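
    The operation both records demonstrate optically is, in discrete terms, a first-order spatial derivative: edges appear where the derivative of the intensity profile peaks. A toy numerical analogue on a 1-D profile:

```python
def spatial_derivative(profile):
    """First-order central difference: the discrete analogue of the
    spatial differentiation the plasmonic device performs optically."""
    return [(profile[i + 1] - profile[i - 1]) / 2.0
            for i in range(1, len(profile) - 1)]

# A 1-D "image" with a sharp step: the derivative is nonzero only
# around the edge, which is exactly what edge detection exploits.
image = [0, 0, 0, 1, 1, 1]
edges = spatial_derivative(image)
```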

  10. High-performance parallel computing in the classroom using the public goods game as an example

    Science.gov (United States)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
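
    A stripped-down version of such a simulation can be written in a few dozen lines. The sketch below is pure Python rather than GPU code, with arbitrarily chosen parameters: it runs a Monte Carlo public goods game on a small periodic square lattice with Fermi-rule strategy updating and returns the final fraction of cooperators.

```python
import math
import random

def play_pgg(L=8, r=3.5, K=0.5, steps=2000, seed=1):
    """Monte Carlo public goods game on an L x L lattice with periodic
    boundaries. Each player belongs to 5 overlapping groups (one centred
    on itself, four centred on its neighbours); cooperators pay cost 1
    per group, pooled contributions are multiplied by r and shared
    equally. Strategy updates use the Fermi rule with noise K."""
    random.seed(seed)
    coop = [[random.random() < 0.5 for _ in range(L)] for _ in range(L)]

    def neighbours(x, y):
        return [((x + 1) % L, y), ((x - 1) % L, y),
                (x, (y + 1) % L), (x, (y - 1) % L)]

    def payoff(x, y):
        total = 0.0
        for gx, gy in [(x, y)] + neighbours(x, y):
            members = [(gx, gy)] + neighbours(gx, gy)
            nc = sum(coop[i][j] for i, j in members)
            total += r * nc / len(members) - (1.0 if coop[x][y] else 0.0)
        return total

    for _ in range(steps):
        x, y = random.randrange(L), random.randrange(L)
        nx, ny = random.choice(neighbours(x, y))
        if coop[x][y] != coop[nx][ny]:
            # Fermi rule: adopt the neighbour's strategy with this probability
            p = 1.0 / (1.0 + math.exp((payoff(x, y) - payoff(nx, ny)) / K))
            if random.random() < p:
                coop[x][y] = coop[nx][ny]
    return sum(sum(row) for row in coop) / (L * L)

rho_c = play_pgg()
```

    The paper's phase-transition result requires much larger lattices and long averaging runs (hence the GPU); this sketch only shows the update rule itself.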

  11. Inferring domain-domain interactions from protein-protein interactions with formal concept analysis.

    Directory of Open Access Journals (Sweden)

    Susan Khor

    Full Text Available Identifying reliable domain-domain interactions will increase our ability to predict novel protein-protein interactions, to unravel interactions in protein complexes, and thus gain more information about the function and behavior of genes. One of the challenges of identifying reliable domain-domain interactions is domain promiscuity. Promiscuous domains are domains that can occur in many domain architectures and are therefore found in many proteins. This becomes a problem for a method where the score of a domain-pair is the ratio between observed and expected frequencies because the protein-protein interaction network is sparse. As such, many protein-pairs will be non-interacting and domain-pairs with promiscuous domains will be penalized. This domain promiscuity challenge to the problem of inferring reliable domain-domain interactions from protein-protein interactions has been recognized, and a number of work-arounds have been proposed. This paper reports on an application of Formal Concept Analysis to this problem. It is found that the relationship between formal concepts provides a natural way for rare domains to elevate the rank of promiscuous domain-pairs and enrich highly ranked domain-pairs with reliable domain-domain interactions. This piggybacking of promiscuous domain-pairs onto less promiscuous domain-pairs is possible only with concept lattices whose attribute-labels are not reduced and is enhanced by the presence of proteins that comprise both promiscuous and rare domains.

  12. Inferring Domain-Domain Interactions from Protein-Protein Interactions with Formal Concept Analysis

    Science.gov (United States)

    Khor, Susan

    2014-01-01

    Identifying reliable domain-domain interactions will increase our ability to predict novel protein-protein interactions, to unravel interactions in protein complexes, and thus gain more information about the function and behavior of genes. One of the challenges of identifying reliable domain-domain interactions is domain promiscuity. Promiscuous domains are domains that can occur in many domain architectures and are therefore found in many proteins. This becomes a problem for a method where the score of a domain-pair is the ratio between observed and expected frequencies because the protein-protein interaction network is sparse. As such, many protein-pairs will be non-interacting and domain-pairs with promiscuous domains will be penalized. This domain promiscuity challenge to the problem of inferring reliable domain-domain interactions from protein-protein interactions has been recognized, and a number of work-arounds have been proposed. This paper reports on an application of Formal Concept Analysis to this problem. It is found that the relationship between formal concepts provides a natural way for rare domains to elevate the rank of promiscuous domain-pairs and enrich highly ranked domain-pairs with reliable domain-domain interactions. This piggybacking of promiscuous domain-pairs onto less promiscuous domain-pairs is possible only with concept lattices whose attribute-labels are not reduced and is enhanced by the presence of proteins that comprise both promiscuous and rare domains. PMID:24586450
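
    For small contexts, the formal concepts that both records rely on can be enumerated by brute force: close every object subset under the two derivation operators. A toy sketch with an invented protein-to-domain context (not the paper's data):

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate the formal concepts of a binary object-attribute context.

    A concept is a pair (objects, attributes) where each set is exactly
    the derivation of the other; promiscuous attributes (here: domains
    appearing in many proteins) sit high in the resulting lattice."""
    objects = list(context)
    attributes = sorted({a for attrs in context.values() for a in attrs})

    def intent(objs):          # attributes shared by all objects in objs
        if not objs:
            return frozenset(attributes)
        return frozenset.intersection(*(frozenset(context[o]) for o in objs))

    def extent(attrs):         # objects having all attributes in attrs
        return frozenset(o for o in objects if attrs <= frozenset(context[o]))

    concepts = set()
    for k in range(len(objects) + 1):
        for objs in combinations(objects, k):
            b = intent(objs)
            concepts.add((extent(b), b))
    return concepts

# Invented protein -> domain context (illustration only).
context = {
    "p1": {"SH3", "kinase"},
    "p2": {"SH3", "SH2"},
    "p3": {"kinase"},
}
concepts = formal_concepts(context)
```

    This exhaustive closure is exponential in the number of objects, so it only illustrates the definition; real protein-interaction contexts need dedicated FCA algorithms.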

  13. The Public Use Limitation in Eminent Domain: "Handley v. Cook."

    Science.gov (United States)

    Grill, Donna P.

    1979-01-01

    It is time for the courts to rigorously scrutinize allegations of public use in order to protect the property rights of private individuals. Available from West Virginia Law Review, W.V.U. Law Center, Morgantown, WV 26506. (Author)

  14. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain.

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-08-12

    A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of the correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach achieved a better (numerically lower) average rank than the term-based approach in each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at the 0.05 level using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
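
    The significance test used here is the Wilcoxon signed-rank test on paired ranks. As a sketch of the statistic itself (the three per-scenario averages from the abstract serve only as a toy three-pair illustration; the paper's actual test runs over the full set of items, and no p-value is computed here):

```python
def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank statistic W for paired samples: rank the
    absolute differences (zero differences dropped, average ranks for
    ties); W is the smaller of the positive- and negative-rank sums."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j < len(order) and abs(diffs[order[j]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j + 1) / 2.0          # average of 1-based ranks i+1..j
        for k in range(i, j):
            ranks[order[k]] = avg
        i = j
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return min(w_plus, w_minus)

# Average ranks per scenario, semantic vs. term-based (from the abstract).
semantic = [3.8, 2.8, 4.5]
term_based = [5.0, 4.9, 6.2]
w = wilcoxon_signed_rank(semantic, term_based)
```

    Here every difference favours the semantic approach, so the positive-rank sum, and hence W, is 0.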

  15. Long-chain GM1 gangliosides alter transmembrane domain registration through interdigitation

    Czech Academy of Sciences Publication Activity Database

    Manna, M.; Javanainen, M.; Martinez-Seara Monne, Hector; Gabius, H. J.; Rog, T.; Vattulainen, I.

    2017-01-01

    Vol. 1859, No. 5 (2017), pp. 870-878 ISSN 0005-2736 Institutional support: RVO:61388963 Keywords: glycosphingolipid * cholesterol * membrane domain * membrane registry * molecular dynamics * computer simulations Subject RIV: CF - Physical; Theoretical Chemistry OECD field: Physical chemistry Impact factor: 3.498, year: 2016

  16. Convergence of Corporate and Public Governance

    OpenAIRE

    Gérard Hirigoyen; Radhoine Laouer

    2013-01-01

    By analyzing the differences between the corporate and public governance, theoretical and empirical research seems to indicate that the two domains of governance are far too different to share any common aspect. However, in this particular research, it has been argued that public governance is an application of corporate governance. Thus, the research question entails the description and analysis of this possible conve...

  17. Integrating publicly-available data to generate computationally ...

    Science.gov (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar
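
    The computational inference step mentioned above uses frequent itemset mining. A brute-force sketch with invented treatment/phenotype "transactions" (not the TG-GATEs or ToxCast data) shows the core idea: keep the co-occurring item sets that meet a support threshold, which then become candidate edges in a cpAOP-style graph.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support=2):
    """Brute-force frequent itemset mining: count every candidate
    itemset of each size and keep those meeting the support threshold.
    (Real miners such as Apriori prune candidates; this sketch does not.)"""
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    for k in range(1, len(items) + 1):
        counts = Counter()
        for t in transactions:
            for combo in combinations(sorted(t), k):
                counts[combo] += 1
        level = {c: n for c, n in counts.items() if n >= min_support}
        if not level:           # no frequent k-itemsets => none larger either
            break
        frequent.update(level)
    return frequent

# Invented observations: entities perturbed together per chemical treatment.
transactions = [
    {"CCl4", "steatosis", "oxidative_stress"},
    {"CCl4", "steatosis"},
    {"ethanol", "steatosis"},
]
fis = frequent_itemsets(transactions)
```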

  18. Reduced order for nuclear reactor model in frequency and time domain

    International Nuclear Information System (INIS)

    Nugroho, D.H.

    1997-01-01

    In control system theory, a model can be represented in the frequency or time domain. In the frequency domain, the model is represented by a transfer function; in the time domain, by a state-space form. For the sake of computational simplicity, it is often necessary to reduce the model order. The main aim of this research is to find the best reduced order for a nuclear reactor model. Model order reduction in the frequency domain can be done using the pole-zero cancellation method, while in the time domain the balanced aggregation method, developed by Moore (1981), is used. In this paper, the two methods were applied to reduce a nuclear reactor model constructed from neutron dynamics and heat transfer equations. To validate that the model characteristics do not change when model order reduction is applied, the responses of the full and reduced order models were compared. It was shown that the nuclear reactor model can be reduced from order 8 to order 2, and that order 2 is the best order for the nuclear reactor model.
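
    The frequency-domain route, pole-zero cancellation, can be sketched directly on the pole and zero lists of a transfer function: any zero lying within a tolerance of a pole is removed together with that pole, lowering the model order. A minimal illustration with invented pole/zero values (not the reactor model of the paper):

```python
def cancel_pole_zero(zeros, poles, tol=1e-2):
    """Frequency-domain order reduction by pole-zero cancellation:
    remove each zero that (nearly) coincides with a pole, since the
    corresponding mode contributes little to the response."""
    remaining_poles = list(poles)
    remaining_zeros = []
    for z in zeros:
        match = next((p for p in remaining_poles if abs(p - z) < tol), None)
        if match is not None:
            remaining_poles.remove(match)   # cancel the pole-zero pair
        else:
            remaining_zeros.append(z)
    return remaining_zeros, remaining_poles

# Toy 3rd-order model with a zero close to one pole: reduces to 2nd order.
zeros = [-2.001]
poles = [-1.0, -2.0, -5.0]
zr, pr = cancel_pole_zero(zeros, poles)
```

    The tolerance is a judgment call: too tight and nothing cancels, too loose and dynamically significant modes are discarded.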

  19. Computer systems and nuclear industry

    International Nuclear Information System (INIS)

    Nkaoua, Th.; Poizat, F.; Augueres, M.J.

    1999-01-01

    This article deals with computer systems in the nuclear industry. In most nuclear facilities it is necessary to handle a great deal of data and actions in order to help plant operators drive the plant, control physical processes and assure safety. The design of reactors requires reliable computer codes able to simulate neutronic, mechanical or thermo-hydraulic behaviour. Calculations and simulations play an important role in safety analysis. In each of these domains, computer systems have progressively emerged as efficient tools to challenge and master complexity. (A.C.)

  20. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, live collaboration on notebooks via Google Docs, and better support for languages other than Python.

  1. A High Performance VLSI Computer Architecture For Computer Graphics

    Science.gov (United States)

    Chin, Chi-Yuan; Lin, Wen-Tai

    1988-10-01

    A VLSI computer architecture consisting of multiple processors is presented in this paper to satisfy the demands of modern computer graphics, e.g. high resolution, realistic animation, and real-time display. All processors share a global memory which is partitioned into multiple banks. Through a crossbar network, data from one memory bank can be broadcast to many processors. Processors are physically interconnected through a hyper-crossbar network (a crossbar-like network). By programming the network, the topology of communication links among processors can be reconfigured to satisfy the specific dataflows of different applications. Each processor consists of a controller, arithmetic operators, local memory, a local crossbar network, and I/O ports to communicate with other processors, memory banks, and a system controller. Operations in each processor are characterized into two modes, i.e. object domain and space domain, to fully utilize the data-independency characteristics of graphics processing. Special graphics features such as 3D-to-2D conversion, shadow generation, texturing, and reflection can be easily handled. With the current high density interconnection (MI) technology, it is feasible to implement a 64-processor system to achieve 2.5 billion operations per second, a performance needed in most advanced graphics applications.

  2. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Jr, Iztok

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters divided into three parts. The first part provides background information and some theoretical foundations of the CI domain, the second part deals with adaptation in CI algorithms, while the third part focuses on hybridization in CI. This book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms, such as artificial neural networks, evolutionary algorithms and swarm intelligence-based algorithms.

  3. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  4. Determination of point of maximum likelihood in failure domain using genetic algorithms

    International Nuclear Information System (INIS)

    Obadage, A.S.; Harnpornchai, N.

    2006-01-01

    The point of maximum likelihood in a failure domain yields the highest value of the probability density function in the failure domain. The maximum-likelihood point thus represents the worst combination of random variables contributing to the failure event. In this work, Genetic Algorithms (GAs) with an adaptive penalty scheme are proposed as a tool for the determination of the maximum-likelihood point. Because the GAs operate only on numerical values, the algorithms are applicable to cases of non-linear and implicit single and multiple limit state functions. Their algorithmic simplicity readily extends the application to higher-dimensional problems. When combined with Monte Carlo simulation, the proposed methodology will reduce the computational complexity and at the same time enhance the possibility of rare-event analysis under limited computational resources. Since no approximation is made in the procedure, the solution obtained is considered accurate. Consequently, GAs can be used as a tool for increasing the computational efficiency in element and system reliability analyses.
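
    A minimal version of such a GA can be sketched as follows. The limit state function, the penalty growth schedule and all GA parameters are invented for illustration; for independent standard normal variables, maximising the log-density inside the failure domain {x : g(x) <= 0} locates the maximum-likelihood point.

```python
import math
import random

def ga_max_likelihood(g, dim=2, pop=40, gens=60, seed=7):
    """GA for the maximum-likelihood point of a failure domain
    {x : g(x) <= 0} under independent standard normals: maximise the
    log-density minus a penalty, growing over the generations, for
    violating the limit state (a simple adaptive penalty scheme)."""
    random.seed(seed)

    def fitness(x, gen):
        logpdf = -0.5 * sum(v * v for v in x)
        penalty = (1.0 + gen) * max(0.0, g(x)) ** 2
        return logpdf - penalty

    population = [[random.gauss(0, 2) for _ in range(dim)] for _ in range(pop)]
    for gen in range(gens):
        # elitist selection: keep the better half as parents
        scored = sorted(population, key=lambda x: fitness(x, gen), reverse=True)
        parents = scored[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            # arithmetic crossover plus small Gaussian mutation
            children.append([(ai + bi) / 2 + random.gauss(0, 0.1)
                             for ai, bi in zip(a, b)])
        population = parents + children
    return max(population, key=lambda x: fitness(x, gens))

# Toy limit state g(x) = 3 - x1 - x2: failure when x1 + x2 >= 3,
# so the maximum-likelihood point is near (1.5, 1.5).
best = ga_max_likelihood(lambda x: 3 - x[0] - x[1])
```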

  5. Survey of Energy Computing in the Smart Grid Domain

    OpenAIRE

    Rajesh Kumar; Arun Agarwala

    2013-01-01

    Resource optimization with advanced computing tools improves the efficient use of energy resources. Renewable energy resources are instantaneous and need to be conserved at the same time. Optimizing real-time processes requires a complex design that includes resource planning and control for effective utilization. Advances in information and communication technology tools enable data formatting and analysis, resulting in optimized use of renewable resources for sustainable energy solutions on s...

  6. Computational Biology and the Limits of Shared Vision

    DEFF Research Database (Denmark)

    Carusi, Annamaria

    2011-01-01

    of cases is necessary in order to gain a better perspective on social sharing of practices, and on what other factors this sharing is dependent upon. The article presents the case of currently emerging inter-disciplinary visual practices in the domain of computational biology, where the sharing of visual practices would be beneficial to the collaborations necessary for the research. Computational biology includes sub-domains where visual practices are coming to be shared across disciplines, and those where this is not occurring, and where the practices of others are resisted. A significant point…, its domain of study. Social practices alone are not sufficient to account for the shaping of evidence. The philosophy of Merleau-Ponty is introduced as providing an alternative framework for thinking of the complex inter-relations between all of these factors. This philosophy enables us…

  7. Multiple Shooting and Time Domain Decomposition Methods

    CERN Document Server

    Geiger, Michael; Körkel, Stefan; Rannacher, Rolf

    2015-01-01

    This book offers a comprehensive collection of the most advanced numerical techniques for the efficient and effective solution of simulation and optimization problems governed by systems of time-dependent differential equations. The contributions present various approaches to time domain decomposition, focusing on multiple shooting and parareal algorithms.  The range of topics covers theoretical analysis of the methods, as well as their algorithmic formulation and guidelines for practical implementation. Selected examples show that the discussed approaches are mandatory for the solution of challenging practical problems. The practicability and efficiency of the presented methods is illustrated by several case studies from fluid dynamics, data compression, image processing and computational biology, giving rise to possible new research topics.  This volume, resulting from the workshop Multiple Shooting and Time Domain Decomposition Methods, held in Heidelberg in May 2013, will be of great interest to applied...

  8. On using moving windows in finite element time domain simulation for long accelerator structures

    International Nuclear Information System (INIS)

    Lee, L.-Q.; Candel, Arno; Ng, Cho; Ko, Kwok

    2010-01-01

    A finite element moving window technique is developed to simulate the propagation of electromagnetic waves induced by the transit of a charged particle beam inside large and long structures. The window moving along with the beam in the computational domain adopts high-order finite element basis functions through p refinement and/or a high-resolution mesh through h refinement so that a sufficient accuracy is attained with substantially reduced computational costs. Algorithms to transfer discretized fields from one mesh to another, which are the keys to implementing a moving window in a finite element unstructured mesh, are presented. Numerical experiments are carried out using the moving window technique to compute short-range wakefields in long accelerator structures. The results are compared with those obtained from the normal finite element time domain (FETD) method and the advantages of using the moving window technique are discussed.
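
    The mesh-to-mesh field transfer that underpins the moving window amounts to evaluating the old mesh's finite element expansion at the nodes of the new mesh. A hypothetical 1-D analogue with piecewise-linear elements (the actual solver uses high-order bases on unstructured 3-D meshes):

```python
import numpy as np

def transfer_field(old_nodes, old_values, new_nodes):
    """Evaluate a piecewise-linear finite element field, given by its nodal
    values on old_nodes, at the nodes of a new (shifted and/or refined) mesh."""
    return np.interp(new_nodes, old_nodes, old_values)

# old window [0, 1] carrying the field u(x) = 2x + 1,
# which is exactly representable by P1 elements
old_nodes = np.linspace(0.0, 1.0, 11)
old_values = 2.0 * old_nodes + 1.0

# window moved forward to [0.3, 1.0] and h-refined
new_nodes = np.linspace(0.3, 1.0, 29)
new_values = transfer_field(old_nodes, old_values, new_nodes)
```

    The transfer is exact whenever the field lies in the old element space, as here; in general the interpolation error is bounded by the approximation order of the old basis.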

  9. Remarkable Computing - the Challenge of Designing for the Home

    DEFF Research Database (Denmark)

    Petersen, Marianne Graves

    2004-01-01

    The vision of ubiquitous computing is floating into the domain of the household, despite arguments that lessons from the design of workplace artefacts cannot be blindly transferred into the domain of the household. This paper discusses why the ideal of unremarkable or ubiquitous computing is too narrow with respect to the household. It points out how understanding technology use is a matter of looking into the process of use, and how the specific context of the home, in several ways, calls for technology to be remarkable rather than unremarkable.

  10. Computers in technical information transfer

    Energy Technology Data Exchange (ETDEWEB)

    Price, C.E.

    1978-01-01

    The use of computers in transferring scientific and technical information from its creation to its use is surveyed. The traditional publication and distribution processes for S and T literature in past years have been the vehicle for transfer, but computers have altered the process in phenomenal ways. Computers are used in literature publication through text editing and photocomposition applications. Abstracting and indexing services use computers for preparing their products, but the machine-readable document descriptions created for this purpose are input to a rapidly growing computerized information retrieval service industry. Computer use is making many traditional processes passé, and may eventually lead to a largely "paperless" information utility.

  11. Parallel computing for homogeneous diffusion and transport equations in neutronics

    International Nuclear Information System (INIS)

    Pinchedez, K.

    1999-06-01

    Parallel computing meets the ever-increasing requirements for neutronic computer code speed and accuracy. In this work, two different approaches have been considered. We first parallelized the sequential algorithm used by the neutronics code CRONOS developed at the French Atomic Energy Commission. The algorithm computes the dominant eigenvalue associated with the PN simplified transport equations by a mixed finite element method. Several parallel algorithms have been developed on distributed memory machines. The performances of the parallel algorithms have been studied experimentally by implementation on a Cray T3D and theoretically by complexity models. A comparison of the various parallel algorithms has confirmed the chosen implementations. We next applied a domain sub-division technique to the two-group diffusion eigenproblem. In the modal synthesis-based method, the global spectrum is determined from the partial spectra associated with the sub-domains. Then the eigenproblem is expanded on a family composed, on the one hand, of eigenfunctions associated with the sub-domains and, on the other hand, of functions corresponding to the contribution of the interface between the sub-domains. For a 2-D homogeneous core, this modal method has been validated and its accuracy has been measured. (author)
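
    The dominant-eigenvalue computation that such codes parallelize is, at its core, a power-type iteration. A generic serial sketch on a small stand-in matrix (illustrative only; CRONOS works with a mixed finite element discretization of the simplified PN equations, not a dense matrix):

```python
import numpy as np

def power_iteration(A, tol=1e-12, max_iter=1000):
    """Dominant eigenvalue/eigenvector of A by power iteration:
    repeatedly apply the operator and normalize."""
    x = np.ones(A.shape[0])
    lam = 0.0
    for _ in range(max_iter):
        y = A @ x
        lam, lam_prev = np.linalg.norm(y), lam
        x = y / lam
        if abs(lam - lam_prev) < tol:
            break
    return lam, x

# small symmetric stand-in for the discretized operator
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, x = power_iteration(A)   # dominant eigenvalue of A is 3 + sqrt(3)
```

    Convergence is geometric in the ratio of the two largest eigenvalues, which is why acceleration and domain-decomposed variants matter for realistic core models.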

  12. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  13. Design of potentially active ligands for SH2 domains by molecular modeling methods

    Directory of Open Access Journals (Sweden)

    Hurmach V. V.

    2014-07-01

    Full Text Available Search for new chemical structures possessing specific biological activity is a complex problem that requires the latest achievements of molecular modeling technologies. It is well known that SH2 domains play a major role in ontogenesis as intermediaries of specific protein-protein interactions. Aim. To develop an algorithm for investigating the binding properties of SH2 domains and to search for new potentially active compounds for the whole SH2 domain class. Methods. In this paper, we utilize a complex of computer modeling methods to create a generic set of potentially active compounds targeting universally the whole class of SH2 domains. A cluster analysis of all available three-dimensional structures of SH2 domains was performed and general pharmacophore models were formulated. The models were used for virtual screening of a collection of drug-like compounds provided by Enamine Ltd. Results. A design technique for a library of potentially active compounds for the SH2 domain class was proposed. Conclusions. An original algorithm for SH2 domain research with the molecular docking method was developed. Using our algorithm, active compounds for SH2 domains were found.

  14. Language Choice and Use of Malaysian Public University Lecturers in the Education Domain

    Science.gov (United States)

    Mei, Tam Lee; Abdullah, Ain Nadzimah; Heng, Chan Swee; Kasim, Zalina Binti Mohd

    2016-01-01

    It is a norm for people from a multilingual and multicultural country such as Malaysia to speak at least two or more languages. Thus, the Malaysian multilingual situation resulted in speakers having to make decisions about which languages are to be used for different purposes in different domains. In order to explain the phenomenon of language…

  15. Bringing numerous methods for expression and promoter analysis to a public cloud computing service.

    Science.gov (United States)

    Polanski, Krzysztof; Gao, Bo; Mason, Sam A; Brown, Paul; Ott, Sascha; Denby, Katherine J; Wild, David L

    2018-03-01

    Every year, a large number of novel algorithms are introduced to the scientific community for a myriad of applications, but using these across different research groups is often troublesome, due to suboptimal implementations and specific dependency requirements. This does not have to be the case, as public cloud computing services can easily house tractable implementations within self-contained dependency environments, making the methods easily accessible to a wider public. We have taken 14 popular methods, the majority related to expression data or promoter analysis, developed these up to a good implementation standard and housed the tools in isolated Docker containers which we integrated into the CyVerse Discovery Environment, making these easily usable for a wide community as part of the CyVerse UK project. The integrated apps can be found at http://www.cyverse.org/discovery-environment, while the raw code is available at https://github.com/cyversewarwick and the corresponding Docker images are housed at https://hub.docker.com/r/cyversewarwick/. info@cyverse.warwick.ac.uk or D.L.Wild@warwick.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  16. Cost efficient CFD simulations: Proper selection of domain partitioning strategies

    Science.gov (United States)

    Haddadi, Bahram; Jordan, Christian; Harasek, Michael

    2017-10-01

    Computational Fluid Dynamics (CFD) is one of the most powerful simulation methods, which is used for temporally and spatially resolved solutions of fluid flow, heat transfer, mass transfer, etc. One of the challenges of Computational Fluid Dynamics is the extreme hardware demand. Nowadays supercomputers (e.g. High Performance Computing, HPC, clusters) featuring multiple CPU cores are applied for solving: the simulation domain is split into partitions, one for each core. Some of the different methods for partitioning are investigated in this paper. As a practical example, a new open source based solver was utilized for simulating packed bed adsorption, a common separation method within the field of thermal process engineering. Adsorption can, for example, be applied for the removal of trace gases from a gas stream or for the production of pure gases such as hydrogen. For comparing the performance of the partitioning methods, a 60 million cell mesh for a packed bed of spherical adsorbents was created; one second of the adsorption process was simulated. Different partitioning methods available in OpenFOAM® (Scotch, Simple, and Hierarchical) have been used with different numbers of sub-domains. The effect of the different methods and of the number of processor cores on the simulation speedup and also on the energy consumption was investigated for two different hardware infrastructures (Vienna Scientific Clusters VSC 2 and VSC 3). As a general recommendation, an optimum number of cells per processor core was calculated. Optimized simulation speed, lower energy consumption and consequently the cost effects are reported here.
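
    In OpenFOAM®, the partitioning method and the number of sub-domains discussed above are chosen in the case's system/decomposeParDict file before running decomposePar; a minimal sketch (the values shown are illustrative, not the paper's settings):

```
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  64;     // aim for the optimum cells-per-core ratio

method              scotch; // alternatives: simple, hierarchical

simpleCoeffs                // only read if method = simple
{
    n       (4 4 4);        // number of splits per coordinate direction
    delta   0.001;
}
```

    Scotch requires no geometric coefficients, which is one reason it is a convenient default for irregular meshes such as packed beds.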

  17. Quantum-corrected plasmonic field analysis using a time domain PMCHWT integral equation

    KAUST Repository

    Uysal, Ismail E.

    2016-03-13

    When two structures are within sub-nanometer distance of each other, quantum tunneling, i.e., electrons "jumping" from one structure to another, becomes relevant. Classical electromagnetic solvers do not directly account for this additional path of current. In this work, an auxiliary tunnel made of Drude material is used to "connect" the structures as a support for this current path (R. Esteban et al., Nat. Commun., 2012). The plasmonic fields on the resulting connected structure are analyzed using a time domain surface integral equation solver. Time domain samples of the dispersive medium Green function and the dielectric permittivities are computed from the analytical inverse Fourier transform applied to the rational function representation of their frequency domain samples.

  18. Nodal domains on isospectral quantum graphs: the resolution of isospectrality?

    International Nuclear Information System (INIS)

    Band, Ram; Shapira, Talia; Smilansky, Uzy

    2006-01-01

    We present and discuss isospectral quantum graphs which are not isometric. These graphs are the analogues of the isospectral domains in R^2 which were introduced recently in Gordon et al (1992 Bull. Am. Math. Soc. 27 134-8), Chapman (1995 Am. Math. Mon. 102 124), Buser et al (1994 Int. Math. Res. Not. 9 391-400), Okada and Shudo (2001 J. Phys. A: Math. Gen. 34 5911-22), Jakobson et al (2006 J. Comput. Appl. Math. 194 141-55) and Levitin et al (2006 J. Phys. A: Math. Gen. 39 2073-82), all based on Sunada's construction of isospectral domains (Sunada T 1985 Ann. Math. 121 196-86). After presenting some of the properties of these graphs, we discuss a few examples which support the conjecture that by counting the nodal domains of the corresponding eigenfunctions one can resolve the isospectral ambiguity.

  19. Thundercloud: Domain specific information security training for the smart grid

    Science.gov (United States)

    Stites, Joseph

    In this paper, we describe a cloud-based virtual smart grid test bed: ThunderCloud, which is intended to be used for domain-specific security training applicable to the smart grid environment. The test bed consists of virtual machines connected using a virtual internal network. ThunderCloud is remotely accessible, allowing students to undergo educational exercises online. We also describe a series of practical exercises that we have developed for providing the domain-specific training using ThunderCloud. The training exercises and attacks are designed to be realistic and to reflect known vulnerabilities and attacks reported in the smart grid environment. We were able to use ThunderCloud to offer practical domain-specific security training for smart grid environment to computer science students at little or no cost to the department and no risk to any real networks or systems.

  20. PRIMARY SCHOOL PRINCIPALS’ ATTITUDES TOWARDS COMPUTER TECHNOLOGY IN THE USE OF COMPUTER TECHNOLOGY IN SCHOOL ADMINISTRATION

    OpenAIRE

    GÜNBAYI, İlhan; CANTÜRK, Gökhan

    2011-01-01

    The aim of the study is to determine the usage of computer technology in school administration, primary school administrators' attitudes towards computer technology, and administrators' and teachers' computer literacy levels. The study was modeled as survey research. The population of the study consists of primary school principals and assistant principals in public primary schools in the center of Antalya. The data were collected from 161 (51%) administrator questionnaires in 68 of 129 public primary s...

  1. Exploiting Publication Contents and Collaboration Networks for Collaborator Recommendation.

    Directory of Open Access Journals (Sweden)

    Xiangjie Kong

    Full Text Available Thanks to the proliferation of online social networks, it has become conventional for researchers to communicate and collaborate with each other. Meanwhile, one critical challenge arises: how to find the most relevant potential collaborators for each researcher? In this work, we propose a novel collaborator recommendation model called CCRec, which combines information on researchers' publications with the collaboration network to generate better recommendations. In order to effectively identify the most promising collaborators for researchers, we adopt a topic clustering model to identify academic domains, as well as a random walk model to compute researchers' feature vectors. Using DBLP datasets, we conduct benchmarking experiments to examine the performance of CCRec. The experimental results show that CCRec outperforms other state-of-the-art methods in terms of precision, recall and F1 score.
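
    The random walk component of such a model can be sketched as a random walk with restart on the collaboration graph, yielding one feature vector per researcher (the toy graph and restart probability below are illustrative; CCRec's exact construction may differ):

```python
import numpy as np

def rwr_features(adj, restart=0.15, tol=1e-10):
    """Feature vector for each node: the stationary distribution of a
    random walk with restart at that node on the collaboration graph."""
    n = adj.shape[0]
    P = adj / adj.sum(axis=0, keepdims=True)   # column-stochastic transitions
    features = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0                             # restart distribution
        p = e.copy()
        while True:                            # fixed-point iteration
            p_new = (1.0 - restart) * (P @ p) + restart * e
            if np.abs(p_new - p).sum() < tol:
                break
            p = p_new
        features[i] = p_new
    return features

# toy collaboration graph: co-authorship edges 0-1, 0-2, 1-2, 2-3
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
features = rwr_features(adj)
```

    Each row is a probability vector, and closer collaborators receive higher weight than distant ones, which is the property a recommender exploits when ranking candidates.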

  2. Toxico-Cheminformatics: New and Expanding Public ...

    Science.gov (United States)

    High-throughput screening (HTS) technologies, along with efforts to improve public access to chemical toxicity information resources and to systematize older toxicity studies, have the potential to significantly improve information gathering efforts for chemical assessments and predictive capabilities in toxicology. Important developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. By annotating toxicity data with associated chemical structure information, these efforts link data across diverse study domains (e.g., ‘omics’, HTS, traditional toxicity studies), toxicity domains (carcinogenicity, developmental toxicity, neurotoxicity, immunotoxicity, etc) and database sources (EPA, FDA, NCI, DSSTox, PubChem, GEO, ArrayExpress, etc.). Public initiatives are developing systematized data models of toxicity study areas and introducing standardized templates, controlled vocabularies, hierarchical organization, and powerful relational searching capability across capt

  3. Advancing the Certified in Public Health Examination: A Job Task Analysis.

    Science.gov (United States)

    Kurz, Richard S; Yager, Christopher; Yager, James D; Foster, Allison; Breidenbach, Daniel H; Irwin, Zachary

    In 2014, the National Board of Public Health Examiners performed a job task analysis (JTA) to revise the Certified in Public Health (CPH) examination. The objectives of this study were to describe the development, administration, and results of the JTA survey; to present an analysis of the survey results; and to review the implications of this first-ever public health JTA. An advisory committee of public health professionals developed a list of 200 public health job tasks categorized into 10 work domains. The list of tasks was incorporated into a web-based survey, and a snowball sample of public health professionals provided 4850 usable responses. Respondents rated job tasks as essential (4), very important (3), important (2), not very important (1), and never performed (0). The mean task importance ratings ranged from 2.61 to 3.01 (important to very important). The highest mean ratings were for tasks in the ethics domain (mean rating, 3.01). Respondents ranked 10 of the 200 tasks as the most important, with mean task rankings ranging from 2.98 to 3.39. We found subtle differences between male and female respondents and between master of public health and doctor of public health respondents in their rankings. The JTA established a set of job tasks in 10 public health work domains, and the results provided a foundation for refining the CPH examination. Additional steps are needed to further modify the content outline of the examination. An empirical assessment of public health job tasks, using methods such as principal components analysis, may provide additional insight.

  4. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held in Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems like parameter settings for controlling processes in bioreactors and other processes, resource constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, localization of an abrupt atmospheric contamination source, and so on. It shows how to develop algorithms for these problems based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  5. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book: (1) Engineering design and simulation; (2) Biomedical sciences; and (3) Interactive and digital media. The book also addresses fundamental issues in GPU computing, with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book, as will training professionals and educators seeking possible applications of GPU technology in various areas.

  6. Interpretation of NMR relaxation properties of Pin1, a two-domain protein, based on Brownian dynamic simulations

    International Nuclear Information System (INIS)

    Bernado, Pau; Fernandes, Miguel X.; Jacobs, Doris M.; Fiebig, Klaus; Garcia de la Torre, Jose; Pons, Miquel

    2004-01-01

    Many important proteins contain multiple domains connected by flexible linkers. Inter-domain motion is suggested to play a key role in many processes involving molecular recognition. Heteronuclear NMR relaxation is sensitive to motions on the relevant time scales and could provide valuable information on the dynamics of multi-domain proteins. However, the standard analysis based on the separation of global tumbling and fast local motions is no longer valid for multi-domain proteins undergoing internal motions that involve complete domains and take place on the same time scale as the overall motion. The complex motions experienced even by the simplest two-domain proteins are difficult to capture with simple extensions of the classical Lipari-Szabo approach. Hydrodynamic effects are expected to dominate the motion of the individual globular domains, as well as that of the complete protein. Using Pin1 as a test case, we have simulated its motion at the microsecond time scale, at a reasonable computational expense, using Brownian dynamics simulations on simplified models. The resulting trajectories provide insight into the interplay between global and inter-domain motion and can be analyzed using the recently published method of isotropic Reorientational Mode Dynamics, which offers a way of calculating their contribution to heteronuclear relaxation rates. The analysis of trajectories computed with Pin1 models of different flexibility provides a general framework for understanding the dynamics of multi-domain proteins and explains some of the observed features in the relaxation rate profile of free Pin1.

  7. Interpretation of NMR relaxation properties of Pin1, a two-domain protein, based on Brownian dynamic simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bernado, Pau [Institut de Biologie Structurale, Jean Pierre Ebel (France); Fernandes, Miguel X. [Universidad de Murcia, Departamento de Quimica Fisica, Facultad de Quimica (Spain); Jacobs, Doris M. [Johann Wolfgang Goethe-Universitaet Frankfurt, Institut fuer Organische Chemie und Chemische Biologie (Germany); Fiebig, Klaus [Affinium Pharmaceuticals (Canada); Garcia de la Torre, Jose [Universidad de Murcia, Departamento de Quimica Fisica, Facultad de Quimica (Spain); Pons, Miquel [Laboratori de RMN de Biomolecules, Parc Cientific de Barcelona (Spain)], E-mail: mpons@ub.edu

    2004-05-15

    Many important proteins contain multiple domains connected by flexible linkers. Inter-domain motion is suggested to play a key role in many processes involving molecular recognition. Heteronuclear NMR relaxation is sensitive to motions on the relevant time scales and could provide valuable information on the dynamics of multi-domain proteins. However, the standard analysis based on the separation of global tumbling and fast local motions is no longer valid for multi-domain proteins undergoing internal motions that involve complete domains and take place on the same time scale as the overall motion. The complex motions experienced even by the simplest two-domain proteins are difficult to capture with simple extensions of the classical Lipari-Szabo approach. Hydrodynamic effects are expected to dominate the motion of the individual globular domains, as well as that of the complete protein. Using Pin1 as a test case, we have simulated its motion at the microsecond time scale, at a reasonable computational expense, using Brownian dynamics simulations on simplified models. The resulting trajectories provide insight into the interplay between global and inter-domain motion and can be analyzed using the recently published method of isotropic Reorientational Mode Dynamics, which offers a way of calculating their contribution to heteronuclear relaxation rates. The analysis of trajectories computed with Pin1 models of different flexibility provides a general framework for understanding the dynamics of multi-domain proteins and explains some of the observed features in the relaxation rate profile of free Pin1.

  8. Innovative User Interfaces in the Industrial Domain

    OpenAIRE

    Jutterström, Jenny

    2010-01-01

    The goal of this thesis is to explore how the HMI of a process control system can be improved by applying modern interaction technologies. Many new interaction possibilities are arising on the market, while interaction in the industrial domain is still quite conservative, with computer mouse and keyboard as the central means of interaction. It is believed that by making use of technology available today, the user interface can provide further assistance to the process control operators a...

  9. Parallel time domain solvers for electrically large transient scattering problems

    KAUST Repository

    Liu, Yang

    2014-09-26

    Marching on in time (MOT)-based integral equation solvers represent an increasingly appealing avenue for analyzing transient electromagnetic interactions with large and complex structures. MOT integral equation solvers for analyzing electromagnetic scattering from perfectly electrically conducting objects are obtained by enforcing electric field boundary conditions and implicitly time-advancing electric surface current densities by iteratively solving sparse systems of equations at all time steps. Contrary to their finite difference and finite element competitors, these solvers apply to nonlinear and multi-scale structures comprising geometrically intricate and deep sub-wavelength features residing atop electrically large platforms. Moreover, they are high-order accurate, stable in the low- and high-frequency limits, and applicable to conducting and penetrable structures represented by highly irregular meshes. This presentation reviews some recent advances in the parallel implementation of time domain integral equation solvers, specifically those that leverage the multilevel plane-wave time-domain (PWTD) algorithm on modern manycore computer architectures including graphics processing units (GPUs) and distributed memory supercomputers. The GPU-based implementation achieves at least one order of magnitude speedup compared to serial implementations, while the distributed parallel implementation is highly scalable to thousands of compute nodes. A distributed parallel PWTD kernel has been adopted to solve time domain surface/volume integral equations (TDSIE/TDVIE) for analyzing transient scattering from large and complex-shaped perfectly electrically conducting (PEC)/dielectric objects involving ten million/tens of millions of spatial unknowns.

  10. Coproduction as a structural transformation of the public sector

    NARCIS (Netherlands)

    Meijer, Albert

    2016-01-01

    Purpose: Coproduction fundamentally changes the roles of citizens and governments. The purpose of this paper is to enhance the theoretical understanding of the transformative changes in the structural order of the public domain that result from the coproduction of public services.

  11. Computer-Aided Transformation of PDE Models: Languages, Representations, and a Calculus of Operations

    Science.gov (United States)

    2016-01-05

    A domain-specific embedded language called ibvp was developed to model initial boundary value problems arising in physical and engineered systems.

  12. Frequency-domain waveform inversion using the phase derivative

    KAUST Repository

    Choi, Yun Seok

    2013-09-26

    Phase wrapping in the frequency domain or cycle skipping in the time domain is the major cause of the local minima problem in the waveform inversion when the starting model is far from the true model. Since the phase derivative does not suffer from the wrapping effect, its inversion has the potential of providing a robust and reliable inversion result. We propose a new waveform inversion algorithm using the phase derivative in the frequency domain along with the exponential damping term to attenuate reflections. We estimate the phase derivative, or what we refer to as the instantaneous traveltime, by taking the derivative of the Fourier-transformed wavefield with respect to the angular frequency, dividing it by the wavefield itself and taking the imaginary part. The objective function is constructed using the phase derivative and the gradient of the objective function is computed using the back-propagation algorithm. Numerical examples show that our inversion algorithm with a strong damping generates a tomographic result even for a high ‘single’ frequency, which can be a good initial model for full waveform inversion and migration.
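
    The instantaneous-traveltime definition above can be checked numerically: for a pulse centred at time t0, the imaginary part of (dU/domega)/U recovers t0 up to sign under numpy's e^{-i omega t} forward-transform convention. A small sketch with illustrative parameters:

```python
import numpy as np

n, dt = 2048, 0.004                    # samples and sampling interval [s]
t = np.arange(n) * dt
t0 = 0.2                               # pulse centre: the traveltime to recover
u = np.exp(-(((t - t0) / 0.02) ** 2))  # Gaussian pulse arriving at t0

U = np.fft.rfft(u)                               # frequency-domain wavefield
omega = 2.0 * np.pi * np.fft.rfftfreq(n, d=dt)   # angular frequencies
dU_domega = np.gradient(U, omega)                # derivative w.r.t. omega

# instantaneous traveltime: imaginary part of (dU/domega) / U
tau = np.imag(dU_domega / U)
```

    With numpy's sign convention tau comes out as approximately -t0 at frequencies where U is not negligible, so its negative recovers the 0.2 s traveltime; unlike the wrapped phase itself, this quantity is smooth in frequency.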

  13. Frequency-domain waveform inversion using the unwrapped phase

    KAUST Repository

    Choi, Yun Seok

    2011-01-01

    Phase wrapping in the frequency domain (or cycle skipping in the time domain) is the major cause of the local minima problem in waveform inversion. The unwrapped phase has the potential to provide us with a robust and reliable waveform inversion, with reduced local minima. We propose a waveform inversion algorithm using the unwrapped phase objective function in the frequency domain. The unwrapped phase, or what we call the instantaneous traveltime, is given by the imaginary part of the derivative of the wavefield with respect to the angular frequency divided by the wavefield itself. As a result, the objective function becomes a traveltime-like function, which allows us to smooth it and reduce its nonlinearity. The gradient of the objective function is computed using the back-propagation algorithm based on the adjoint-state technique. We apply both our waveform inversion algorithm using the unwrapped phase and the conventional waveform inversion, and show that our inversion algorithm gives better convergence to the true model than the conventional waveform inversion. © 2011 Society of Exploration Geophysicists.

  14. BUILDING A COMPLETE FREE AND OPEN SOURCE GIS INFRASTRUCTURE FOR HYDROLOGICAL COMPUTING AND DATA PUBLICATION USING GIS.LAB AND GISQUICK PLATFORMS

    Directory of Open Access Journals (Sweden)

    M. Landa

    2017-07-01

    Full Text Available Building a complete free and open source GIS computing and data publication platform can be a relatively easy task. This paper describes an automated deployment of such a platform using two open source software projects: GIS.lab and Gisquick. GIS.lab (http://web.gislab.io) is a project for rapid deployment of a complete, centrally managed and horizontally scalable GIS infrastructure in the local area network, data center or cloud. It provides a comprehensive set of free geospatial software seamlessly integrated into one easy-to-use system. A platform for GIS computing (in our case demonstrated on hydrological data processing) requires core components such as a geoprocessing server, a map server, and a computation engine, e.g. GRASS GIS, SAGA, or other similar GIS software. All these components can be rapidly and automatically deployed by the GIS.lab platform. In our demonstrated solution, PyWPS is used for serving WPS processes built on top of the GRASS GIS computation platform. GIS.lab can be easily extended by other components running in Docker containers. This approach is shown on the seamless integration of Gisquick. Gisquick (http://gisquick.org) is an open source platform for publishing geospatial data in the sense of rapid sharing of QGIS projects on the web. The platform consists of a QGIS plugin, a Django-based server application, QGIS server, and web/mobile clients. This paper shows how to easily deploy a complete open source GIS infrastructure allowing all required operations: data preparation on the desktop, data sharing, and geospatial computation as a service. It also includes data publication in the sense of OGC Web Services and, importantly, as interactive web mapping applications.

  15. Fast analysis of wide-band scattering from electrically large targets with time-domain parabolic equation method

    Science.gov (United States)

    He, Zi; Chen, Ru-Shan

    2016-03-01

    An efficient three-dimensional time-domain parabolic equation (TDPE) method is proposed to rapidly analyze the narrow-angle wideband EM scattering properties of electrically large targets. The finite-difference (FD) Crank-Nicolson (CN) scheme is the traditional tool for solving the time-domain parabolic equation. However, huge computational resources are required when the meshes become dense. Therefore, the alternating direction implicit (ADI) scheme is introduced to discretize the time-domain parabolic equation. In this way, the reduced transient scattered fields can be calculated line by line in each transverse plane for any time step with unconditional stability. As a result, fewer computational resources are required for the proposed ADI-based TDPE method when compared with both the traditional CN-based TDPE method and the finite-difference time-domain (FDTD) method. By employing the rotating TDPE method, the complete bistatic RCS can be obtained with encouraging accuracy for any observation angle. Numerical examples are given to demonstrate the accuracy and efficiency of the proposed method.
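    The key idea of ADI, solving line by line with only tridiagonal systems per half step, can be illustrated on a much simpler model problem. The sketch below applies a Peaceman-Rachford ADI step to the 2D diffusion equation (not the vector parabolic wave equation of the paper) and runs it well above the explicit stability limit; it is a didactic toy, with all parameters chosen arbitrarily.

```python
def thomas(sub, diag, sup, rhs):
    """Solve a tridiagonal system in O(n) (Thomas algorithm)."""
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    """One Peaceman-Rachford ADI step for u_t = u_xx + u_yy on a square
    grid with zero Dirichlet boundaries (r = dt / h^2). Each half step is
    implicit along one axis only, so every 1D solve is tridiagonal."""
    n = len(u)
    m = n - 2                                  # interior points per line
    sub, diag, sup = [-0.5 * r] * m, [1.0 + r] * m, [-0.5 * r] * m
    half = [[0.0] * n for _ in range(n)]
    for j in range(1, n - 1):                  # implicit in x, explicit in y
        rhs = [u[i][j] + 0.5 * r * (u[i][j + 1] - 2 * u[i][j] + u[i][j - 1])
               for i in range(1, n - 1)]
        for k, v in enumerate(thomas(sub, diag, sup, rhs)):
            half[k + 1][j] = v
    out = [[0.0] * n for _ in range(n)]
    for i in range(1, n - 1):                  # implicit in y, explicit in x
        rhs = [half[i][j] + 0.5 * r * (half[i + 1][j] - 2 * half[i][j] + half[i - 1][j])
               for j in range(1, n - 1)]
        for k, v in enumerate(thomas(sub, diag, sup, rhs)):
            out[i][k + 1] = v
    return out

# Point disturbance in the middle of an 11 x 11 grid, stepped with a
# ratio far above the explicit scheme's stability limit of r <= 0.25.
n = 11
u = [[0.0] * n for _ in range(n)]
u[n // 2][n // 2] = 1.0
for _ in range(5):
    u = adi_step(u, r=2.0)
peak = max(max(row) for row in u)
```

The disturbance diffuses and decays without blowing up, which is the unconditional-stability property the abstract exploits to keep the per-step cost low.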

  16. The YARHG domain: an extracellular domain in search of a function.

    Directory of Open Access Journals (Sweden)

    Penny Coggill

    Full Text Available We have identified a new bacterial protein domain that we hypothesise binds to peptidoglycan. This domain is called the YARHG domain after the most highly conserved sequence-segment. The domain is found in the extracellular space and is likely to be composed of four alpha-helices. The domain is found associated with protein kinase domains, suggesting it is associated with signalling in some bacteria. The domain is also found associated with three different families of peptidases. The large number of different domains that are found associated with YARHG suggests that it is a useful functional module that nature has recombined multiple times.

  17. Comparison of four computational methods for computing Q factors and resonance wavelengths in photonic crystal membrane cavities

    DEFF Research Database (Denmark)

    de Lasson, Jakob Rosenkrantz; Frandsen, Lars Hagedorn; Burger, Sven

    2016-01-01

    We benchmark four state-of-the-art computational methods by computing quality factors and resonance wavelengths in photonic crystal membrane L5 and L9 line defect cavities. The convergence of the methods with respect to resolution, degrees of freedom and number of modes is investigated. Special attention is paid to the influence of the size of the computational domain. Convergence is not obtained for some of the methods, indicating that some are more suitable than others for analyzing line defect cavities.

  18. Materialities of Law: Celebrity Production and the Public Domain

    Directory of Open Access Journals (Sweden)

    Esther Milne

    2009-12-01

    Full Text Available Celebrity production and consumption are powerful socio-economic forces. The celebrity functions as a significant economic resource for the commercial sector and plays a fundamental symbolic role within culture by providing a shared ‘vocabulary’ through which to understand contemporary social relations. A pivotal element of this allure is the process by which the celebrity figure is able to forge an intimate link with its audience, often producing public expressions of profound compassion, respect or revulsion. This process, however, is complicated by emerging participatory media forms whose impact is experienced as new conditions of possibility for celebrity production and consumption. As Marshall argues, video mash-ups of celebrity interviews, such as those of Christian Bale or Tom Cruise, are dramatically changing the relation between celebrity and audience (Marshall, 2006: 640). Meanings produced by these audience remixes challenge the extent to which a celebrity might control her image. So is the celebrity personality, therefore, a public or private commodity? Who owns the celebrity image within remix culture? Although the celebrity figure has been thoroughly researched in relation to its patterns of consumption, semiotic power, and industry construction, less attention has been focused on the forms of celebrity governance enabled by legislative and case law settings. How might the law deal with the significant economic and cultural power exercised within celebrity culture?

  19. A SURVEY ON UBIQUITOUS COMPUTING

    Directory of Open Access Journals (Sweden)

    Vishal Meshram

    2016-01-01

    Full Text Available This work presents a survey of ubiquitous computing research, the emerging domain that implements communication technologies into day-to-day life activities. This research paper provides a classification of the research areas in the ubiquitous computing paradigm. In this paper, we present common architecture principles of ubiquitous systems and analyze important aspects of context-aware ubiquitous systems. In addition, this research work presents a novel architecture for a ubiquitous computing system and a survey of the sensors needed for applications in ubiquitous computing. The goals of this research work are three-fold: (i) to serve as a guideline for researchers who are new to ubiquitous computing and want to contribute to this research area, (ii) to provide a novel system architecture for a ubiquitous computing system, and (iii) to provide further research directions required for quality-of-service assurance in ubiquitous computing.

  20. Designing a Secure Storage Repository for Sharing Scientific Datasets using Public Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Kumbhare, Alok [Univ. of Southern California, Los Angeles, CA (United States); Simmhan, Yogesth [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2011-11-14

    As Cloud platforms gain increasing traction among scientific and business communities for outsourcing storage, computing and content delivery, there is also growing concern about the associated loss of control over private data hosted in the Cloud. In this paper, we present an architecture for a secure data repository service designed on top of a public Cloud infrastructure to support multi-disciplinary scientific communities dealing with personal and human subject data, motivated by the smart power grid domain. Our repository model allows users to securely store and share their data in the Cloud without revealing the plain text to unauthorized users, the Cloud storage provider or the repository itself. The system masks file names, user permissions and access patterns while providing auditing capabilities with provable data updates.

  1. Recommended documentation for computer users at ANL. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Heiberger, A.A.

    1992-04-01

    Recommended Documentation for Computer Users at ANL is for all users of the services available from the Argonne National Laboratory (ANL) Computing and Telecommunications Division (CTD). This document will guide you in selecting available documentation that will best fill your particular needs. Chapter 1 explains how to use this document to select documents and how to obtain them from the CTD Document Distribution Counter. Chapter 2 contains a table that categorizes available publications. Chapter 3 describes the online DOCUMENT command for CMS, VAX, and the Sun workstation; DOCUMENT allows you to scan for and order documentation that interests you. Chapter 4 lists publications by subject. Categories I and IX cover publications of a general nature and publications on telecommunications and networks, respectively. Categories II, III, IV, V, VI, VII, VIII, and X cover publications on specific computer systems. Category XI covers publications on advanced scientific computing at Argonne. Chapter 5 contains abstracts for each publication, arranged alphabetically. Chapter 6 describes additional publications containing bibliographies and master indexes that the user may find useful. The appendix identifies available computer systems, applications, languages, and libraries.

  2. .Gov Domains API

    Data.gov (United States)

    General Services Administration — This dataset offers the list of all .gov domains, including state, local, and tribal .gov domains. It does not include .mil domains, or other federal domains outside...

  3. A Cross-Domain Explanation of the Metaphor "Teaching as Persuasion."

    Science.gov (United States)

    Woods, Bradford S.; Demerath, Peter

    2001-01-01

    Examines what the metaphor "teaching as persuasion" would mean in the domains of philosophy, anthropology, and teacher education, asserting that if such a metaphor is to be widely accepted by the educational community and the public, then this discussion is necessary. The metaphor suggests that in teacher education, learning to teach…

  4. The Role of Domain Knowledge in Cognitive Modeling of Information Search

    NARCIS (Netherlands)

    Karanam, S.; Jorge-Botana, Guillermo; Olmos, Ricardo; van Oostendorp, H.

    2017-01-01

    Computational cognitive models developed so far do not incorporate individual differences in domain knowledge in predicting user clicks on search result pages. We address this problem using a cognitive model of information search which enables us to use two semantic spaces having a low (non-expert

  5. Open-geometry Fourier modal method: modeling nanophotonic structures in infinite domains

    DEFF Research Database (Denmark)

    Häyrynen, Teppo; de Lasson, Jakob Rosenkrantz; Gregersen, Niels

    2016-01-01

    We present an open-geometry Fourier modal method based on a new combination of open boundary conditions and an efficient k-space discretization. The open boundary of the computational domain is obtained using basis functions that expand the whole space, and the integrals subsequently appearing due...

  6. Essentials of Computational Electromagnetics

    CERN Document Server

    Sheng, Xin-Qing

    2012-01-01

    Essentials of Computational Electromagnetics provides an in-depth introduction to the three main full-wave numerical methods in computational electromagnetics (CEM); namely, the method of moments (MoM), the finite element method (FEM), and the finite-difference time-domain (FDTD) method. Numerous monographs can be found addressing one of the above three methods. However, few give a broad general overview of the essentials embodied in these methods, or were published too early to include recent advances. Furthermore, many existing monographs only present the final numerical results without specifying

  7. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Fragments recovered from the report: it contrasts two paradigms, few sensors/complex computations and many sensors/simple computation, and discusses challenges with nano-enabled neuromorphic chips. Foundations of Neuromorphic Computing, final technical report, May 2013; approved for public release, distribution unlimited; in-house effort, Sep 2009 – Sep 2012.

  8. On the Definition of Public Relations: A European View.

    Science.gov (United States)

    Vercic, Dejan; van Ruler, Betteke; Butschi, Gerhard; Flodin, Bertil

    2001-01-01

    Introduces the project on the European Public Relations Body of Knowledge (EBOK). Reviews proposals on the definition, dimensions, and domain of public relations. Confronts these with findings from EBOK. Presents ideas on how to bridge the differences. Proposes ideas for further investigation. (SG)

  9. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Among the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an HCI style guide unique to Army weapon systems. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. Its purpose is to provide HCI design guidance for RT/NRT Army systems across the weapon system domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing its own domain-specific style guide, which will be used to guide the development of future systems within that domain.

  10. Promoter-enhancer interactions identified from Hi-C data using probabilistic models and hierarchical topological domains.

    Science.gov (United States)

    Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy

    2017-12-21

    Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As Hi-C data accumulate, computational methods have been developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.
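    PSYCHIC's probabilistic segmentation is not reproduced here, but the general idea of locating TAD borders from a contact matrix can be sketched with the simpler, widely used insulation-score heuristic: a border is a bin where few contacts cross the diagonal. The matrix below is a synthetic toy, not Hi-C data.

```python
def insulation_scores(matrix, window=2):
    """Mean contact frequency in a window x window square straddling the
    diagonal at each candidate boundary; low scores suggest TAD borders."""
    n = len(matrix)
    scores = {}
    for b in range(window, n - window + 1):
        vals = [matrix[i][j]
                for i in range(b - window, b)      # bins upstream of b
                for j in range(b, b + window)]     # bins downstream of b
        scores[b] = sum(vals) / len(vals)
    return scores

# Toy contact matrix: two 5-bin domains with strong intra-domain contacts
intra, inter, n = 10.0, 1.0, 10
hic = [[intra if (i < 5) == (j < 5) else inter for j in range(n)]
       for i in range(n)]

scores = insulation_scores(hic)
boundary = min(scores, key=scores.get)   # bin with the lowest insulation
```

On this toy matrix the minimum insulation falls exactly at the boundary between the two blocks, which is the signal border-calling methods formalize statistically.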

  11. Scarcity and Environmental Stress in Public Organizations: A Conjectural Essay.

    Science.gov (United States)

    Bozeman, Barry; Slusher, E. Allen

    1979-01-01

    Assuming extreme scarcity, arguments are developed that public organizations could be expected to constrain their domain definition, allow domain selection to dictate technology, seek clientele suited to existing technologies, and, in general, take actions that will ensure that existing technologies are employed at capacity. (Author/IRT)

  12. 78 FR 48170 - Privacy Act of 1974; CMS Computer Match No. 2013-12; HHS Computer Match No. 1307; SSA Computer...

    Science.gov (United States)

    2013-08-07

    ....hhs.gov . SUPPLEMENTARY INFORMATION: The Computer Matching and Privacy Protection Act of 1988 (Public... computer matching involving Federal agencies could be performed and adding certain protections for... Affordability Programs under the Patient Protection and Affordable Care Act''. SECURITY CLASSIFICATION...

  13. When frames align: The interplay between PR, news media, and the public in times of crisis

    NARCIS (Netherlands)

    van der Meer, T.G.L.A.; Verhoeven, P.; Beentjes, H.; Vliegenthart, R.

    2014-01-01

    This study focuses on the frame-building process of organizational-crisis situations in the interplay between the domains public relations (PR), news media, and the public. The purpose of the study is to investigate whether the crisis frames of the domains align over time. To empirically analyze

  14. Same but not alike: Structure, flexibility and energetics of domains in multi-domain proteins are influenced by the presence of other domains.

    Science.gov (United States)

    Vishwanath, Sneha; de Brevern, Alexandre G; Srinivasan, Narayanaswamy

    2018-02-01

    The majority of the proteins encoded in the genomes of eukaryotes contain more than one domain. The high prevalence of multi-domain proteins in various organisms has been attributed to their higher stability and to functional and folding advantages over single-domain proteins. Despite these advantages, many proteins are composed of only one domain while their homologous domains are part of multi-domain proteins. In the study presented here, differences in the properties of protein domains in single-domain and multi-domain systems and their influence on functions are discussed. We studied 20 pairs of identical protein domains, which were crystallized in two forms: (a) tethered to other protein domains and (b) tethered to fewer protein domains than in (a), or not tethered to any protein domain. Results suggest that tethering of domains in multi-domain proteins influences the structural, dynamic and energetic properties of the constituent protein domains. 50% of the protein domain pairs show significant structural deviations, 90% show differences in dynamics, and 12% of the residues show differences in energetics. To gain further insight into the influence of tethering on the function of the domains, 4 pairs of homologous protein domains, in which one member is a full-length single-domain protein and the other is part of a multi-domain protein, were studied. Analyses showed that identical and structurally equivalent functional residues show differential dynamics in homologous protein domains, though comparable dynamics were observed between in-silico generated chimera proteins and multi-domain proteins. From these observations, the differences observed in the functions of homologous proteins could be attributed to the presence of a tethered domain. Overall, we conclude that tethered domains in multi-domain proteins not only provide stability or folding advantages but also influence pathways resulting in differences in

  15. Public sector leadership: New perspectives for research and practice

    OpenAIRE

    D. Orazi; A.Turrini; G. Valotti

    2013-01-01

    In this paper, we aim to portray the state of the art in public sector leadership in order to recommend directions for research and training practice. To this end, we review the scattered strands of literature on public sector leadership (PSL) and classify them in a single framework. The results of the study suggest that public sector leadership is emerging as a distinctive and autonomous domain in public administration/public management studies, although the debate is still underdeveloped co...

  16. Public debate - radioactive wastes management; Debat public - gestion des dechets radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Between September 2005 and January 2006 a national debate was organized on radioactive waste management. This debate aimed to inform the public and to allow it to express its opinion. This document presents the reasons for the debate, how it was conducted, a synthesis of the results, and technical documents providing information in the domain of radioactive waste management. (A.L.B.)

  17. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
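    The format comparison described above amounts to testing whether mean scale responses differ between two independent groups. As a minimal sketch (with hypothetical scores, not the study's data), Welch's t statistic for unequal variances can be computed from the standard library alone:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Hypothetical depression-scale scores from the two survey formats
paper  = [12, 15, 11, 14, 13, 16, 12, 15]
tablet = [13, 14, 12, 15, 13, 15, 11, 16]

t = welch_t(paper, tablet)
# |t| near zero is consistent with "no format effect" on mean responses
```

In a real analysis one would also compute the Welch-Satterthwaite degrees of freedom and a p-value (e.g. via scipy.stats.ttest_ind with equal_var=False), and note that a non-significant difference is not by itself evidence of equivalence.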

  18. A modified CoSaMP algorithm for electromagnetic imaging of two dimensional domains

    KAUST Repository

    Sandhu, Ali Imran; Bagci, Hakan

    2017-01-01

    The compressive sampling matching pursuit (CoSaMP) algorithm is used for solving the electromagnetic inverse scattering problem on two-dimensional sparse domains. Since the scattering matrix, which is computed by sampling the Green function, does

  19. Time-domain analytic Solutions of two-wire transmission line excited by a plane-wave field

    Institute of Scientific and Technical Information of China (English)

    Ni Gu-Yan; Yan Li; Yuan Nai-Chang

    2008-01-01

    This paper reports that an analytic method is used to calculate the load responses of a two-wire transmission line excited by a plane wave directly in the time domain. From the frequency-domain Baum-Liu-Tesche (BLT) equation, the time-domain analytic solutions are obtained and expressed as an infinite geometric series. Moreover, it is shown that only finitely many terms of the series are nonzero if the time variable is restricted to a finite interval; in other words, on a finite time interval the time-domain analytic solutions reduce to a finite geometric series. The computed results are subsequently compared with transient responses obtained by using the frequency-domain BLT equation via a fast Fourier transform, and the agreement is excellent.
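    The finiteness of the series on a bounded time interval has a simple physical reading: each term is one more round trip on the line, and only finitely many round trips fit before time t. The toy model below illustrates this with a single reflection coefficient and a one-way delay; it is a cartoon of multiple reflections, not the BLT solution itself.

```python
def load_response(t, source, gamma, delay):
    """Sum of successive reflections:
        v(t) = sum_n gamma^n * s(t - (2n+1)*delay).
    Only finitely many terms are nonzero for finite t, because each
    round trip adds 2*delay of travel time. Toy model, hypothetical form."""
    total, n = 0.0, 0
    while (2 * n + 1) * delay <= t:
        total += gamma ** n * source(t - (2 * n + 1) * delay)
        n += 1
    return total

step = lambda t: 1.0 if t >= 0 else 0.0   # unit-step incident wave

# After 5 one-way delays, exactly three reflections have arrived:
v = load_response(t=5.0, source=step, gamma=0.5, delay=1.0)
# v = 1 + 0.5 + 0.25 = 1.75
```

The partial sums converge geometrically toward the steady-state value 1/(1 - gamma), mirroring the closed-form geometric series in the paper.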

  20. Molecular Mechanics of the α-Actinin Rod Domain: Bending, Torsional, and Extensional Behavior

    Science.gov (United States)

    Golji, Javad; Collins, Robert; Mofrad, Mohammad R. K.

    2009-01-01

    α-Actinin is an actin crosslinking molecule that can serve as a scaffold and maintain dynamic actin filament networks. As a crosslinker in the stressed cytoskeleton, α-actinin can retain conformation, function, and strength. α-Actinin has an actin binding domain and a calmodulin homology domain separated by a long rod domain. Using molecular dynamics and normal mode analysis, we suggest that the α-actinin rod domain has flexible terminal regions which can twist and extend under mechanical stress, yet has a highly rigid interior region stabilized by aromatic packing within each spectrin repeat, by electrostatic interactions between the spectrin repeats, and by strong salt bridges between its two anti-parallel monomers. By exploring the natural vibrations of the α-actinin rod domain and by conducting bending molecular dynamics simulations we also predict that bending of the rod domain is possible with minimal force. We introduce computational methods for analyzing the torsional strain of molecules using rotating constraints. Molecular dynamics extension of the α-actinin rod is also performed, demonstrating transduction of the unfolding forces across salt bridges to the associated monomer of the α-actinin rod domain. PMID:19436721

  1. OpenPSTD : The open source pseudospectral time-domain method for acoustic propagation

    NARCIS (Netherlands)

    Hornikx, M.C.J.; Krijnen, T.F.; van Harten, L.

    2016-01-01

    An open source implementation of the Fourier pseudospectral time-domain (PSTD) method for computing the propagation of sound is presented, which is geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in

  2. Domain-restricted mutation analysis to identify novel driver events in human cancer

    Directory of Open Access Journals (Sweden)

    Sanket Desai

    2017-10-01

    Full Text Available Analysis of mutational spectra across various cancer types has given valuable insights into tumorigenesis. Different approaches have been used to identify novel drivers from the set of somatic mutations, including methods that use sequence conservation, geometric localization and pathway information. Recent computational methods suggest the use of protein domain information for analysis and understanding of the functional consequences of non-synonymous mutations. Similarly, evidence suggests that recurrence at specific positions in proteins is a robust indicator of functional impact. Building on this, we performed a systematic analysis of TCGA exome-derived somatic mutations across 6089 PFAM domains, and significantly mutated domains were identified using a randomization approach. Multiple alignment of individual domains allowed us to prioritize conserved residues mutated at analogous positions across different proteins in a statistically disciplined manner. In addition to the known frequently mutated genes, this analysis independently identifies the low-frequency Meprin and TRAF-Homology (MATH) domain in the Speckle Type BTB/POZ (SPOP) protein in prostate adenocarcinoma. Results from this analysis will help generate hypotheses about the downstream molecular mechanisms resulting in cancer phenotypes.
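    The randomization approach mentioned above can be sketched as a permutation test: scatter the observed number of mutations uniformly over the protein and ask how often the domain captures at least as many as were actually seen. All counts and coordinates below are hypothetical, and this is a generic sketch rather than the paper's exact procedure.

```python
import random

def domain_enrichment_p(n_mut, protein_len, dom_start, dom_end,
                        observed_in_domain, trials=2000, seed=7):
    """Permutation test: place n_mut mutations uniformly at random over the
    protein and count how often the domain captures at least the observed
    number. A small p-value marks the domain as significantly mutated."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        in_dom = sum(1 for _ in range(n_mut)
                     if dom_start <= rng.randrange(protein_len) <= dom_end)
        if in_dom >= observed_in_domain:
            hits += 1
    return (hits + 1) / (trials + 1)      # add-one correction avoids p = 0

# Hypothetical example: 30 of 40 mutations land in a 101-residue domain
# of a 500-residue protein (expected by chance: about 8).
p = domain_enrichment_p(n_mut=40, protein_len=500,
                        dom_start=100, dom_end=200, observed_in_domain=30)
```

Across thousands of domains, such p-values would then need multiple-testing correction (e.g. Benjamini-Hochberg) before calling any domain significantly mutated.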

  3. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 paper [19], which documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project, documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has

  4. Dislocation dynamics in non-convex domains using finite elements with embedded discontinuities

    Science.gov (United States)

    Romero, Ignacio; Segurado, Javier; LLorca, Javier

    2008-04-01

    The standard strategy developed by Van der Giessen and Needleman (1995 Modelling Simul. Mater. Sci. Eng. 3 689) to simulate dislocation dynamics in two-dimensional finite domains was modified to account for the effect of dislocations leaving the crystal through a free surface in the case of arbitrary non-convex domains. The new approach incorporates the displacement jumps across the slip segments of the dislocations that have exited the crystal within the finite element analysis carried out to compute the image stresses on the dislocations due to the finite boundaries. This is done in a simple computationally efficient way by embedding the discontinuities in the finite element solution, a strategy often used in the numerical simulation of crack propagation in solids. Two academic examples are presented to validate and demonstrate the extended model and its implementation within a finite element program is detailed in the appendix.

  5. Dislocation dynamics in non-convex domains using finite elements with embedded discontinuities

    International Nuclear Information System (INIS)

    Romero, Ignacio; Segurado, Javier; LLorca, Javier

    2008-01-01

    The standard strategy developed by Van der Giessen and Needleman (1995 Modelling Simul. Mater. Sci. Eng. 3 689) to simulate dislocation dynamics in two-dimensional finite domains was modified to account for the effect of dislocations leaving the crystal through a free surface in the case of arbitrary non-convex domains. The new approach incorporates the displacement jumps across the slip segments of the dislocations that have exited the crystal within the finite element analysis carried out to compute the image stresses on the dislocations due to the finite boundaries. This is done in a simple computationally efficient way by embedding the discontinuities in the finite element solution, a strategy often used in the numerical simulation of crack propagation in solids. Two academic examples are presented to validate and demonstrate the extended model and its implementation within a finite element program is detailed in the appendix

  6. Domain decomposition for the computation of radiosity in lighting simulation; Decomposition de domaines pour le calcul de la radiosite en simulation d'eclairage

    Energy Technology Data Exchange (ETDEWEB)

    Salque, B

    1998-07-01

    This work deals with the equation of radiosity, which describes the transport of light energy through a diffuse medium; its resolution enables us to simulate the presence of light sources. The equation of radiosity is an integral equation that admits a unique solution in realistic cases. The different solution methods are reviewed. The equation of radiosity cannot be formulated as the integral form of a classical partial differential equation, but this work shows that the technique of domain decomposition can be successfully applied to the equation of radiosity if the approach is framed by physical considerations. The method provides a system of independent equations, valid for each sub-domain, whose main parameter is luminance. Some numerical examples give an idea of the convergence of the algorithm. The method is applied to optimizing the shape of a light reflector.
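    In its discretized form the radiosity equation reads B = E + ρFB, where B is patch radiosity, E emission, ρ reflectance, and F the form-factor matrix. As a minimal sketch (a toy three-patch enclosure with hypothetical form factors, not the domain-decomposition scheme of the thesis), the fixed point can be reached by simple iteration:

```python
def solve_radiosity(emission, reflectance, form_factors, iters=200):
    """Jacobi-style iteration for B = E + rho * F B: each patch's radiosity
    is its emission plus the reflected light gathered from all patches.
    Converges when max(rho) * max row sum of F is below 1."""
    n = len(emission)
    B = list(emission)
    for _ in range(iters):
        B = [emission[i] + reflectance[i] *
             sum(form_factors[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Toy 3-patch enclosure (hypothetical form factors; each row sums to 1)
E   = [1.0, 0.0, 0.0]          # only patch 0 emits
rho = [0.0, 0.8, 0.5]          # diffuse reflectances
F   = [[0.0, 0.5, 0.5],
       [0.5, 0.0, 0.5],
       [0.5, 0.5, 0.0]]

B = solve_radiosity(E, rho, F)
```

Domain decomposition replaces this single global solve by smaller per-subdomain systems coupled only through their interfaces, which is what makes the approach scale to large scenes.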

  7. Computational neuroscience a first course

    CERN Document Server

    Mallot, Hanspeter A

    2013-01-01

    Computational Neuroscience - A First Course provides an essential introduction to computational neuroscience and equips readers with a fundamental understanding of modeling the nervous system at the membrane, cellular, and network level. The book, which grew out of a lecture series held regularly for more than ten years for graduate students in neuroscience with backgrounds in biology, psychology and medicine, takes its readers on a journey through three fundamental domains of computational neuroscience: membrane biophysics, systems theory and artificial neural networks. The required mathematical concepts are kept as intuitive and simple as possible throughout the book, making it fully accessible to readers who are less familiar with mathematics. Overall, Computational Neuroscience - A First Course represents an essential reference guide for all neuroscientists who use computational methods in their daily work, as well as for any theoretical scientist approaching the field of computational neuroscience.

  8. Utility Computing: Reality and Beyond

    Science.gov (United States)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. The idea of computing as a public utility, much like water, gas, electricity and telecommunications, was announced in 1955, and Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs into variable `on-demand' services. How far should technology, business and society go in adopting Utility Computing forms, modes and models?

  9. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B. Bavishna; M. Agalya; G. Kavitha

    2018-01-01

    A lot of research has been done in the field of cloud computing. For effective performance, a variety of algorithms have been proposed. The role of virtualization is significant, and performance depends on VM migration and allocation. Considerable energy is consumed in the cloud; therefore, several algorithms are employed to save energy and enhance efficiency. In the proposed work, a green algorithm has been considered with ...

  10. CORAL: aligning conserved core regions across domain families.

    Science.gov (United States)

    Fong, Jessica H; Marchler-Bauer, Aron

    2009-08-01

    Homologous protein families share highly conserved sequence and structure regions that are frequent targets for comparative analysis of related proteins and families. Many protein families, such as the curated domain families in the Conserved Domain Database (CDD), exhibit similar structural cores. To improve accuracy in aligning such protein families, we propose a profile-profile method CORAL that aligns individual core regions as gap-free units. CORAL computes optimal local alignment of two profiles with heuristics to preserve continuity within core regions. We benchmarked its performance on curated domains in CDD, which have pre-defined core regions, against COMPASS, HHalign and PSI-BLAST, using structure superpositions and comprehensive curator-optimized alignments as standards of truth. CORAL improves alignment accuracy on core regions over general profile methods, returning a balanced score of 0.57 for over 80% of all domain families in CDD, compared with the highest balanced score of 0.45 from other methods. Further, CORAL provides E-values to aid in detecting homologous protein families and, by respecting block boundaries, produces alignments with improved 'readability' that facilitate manual refinement. CORAL will be included in future versions of the NCBI Cn3D/CDTree software, which can be downloaded at http://www.ncbi.nlm.nih.gov/Structure/cdtree/cdtree.shtml. Supplementary data are available at Bioinformatics online.
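
    The idea of treating a conserved core as a gap-free unit can be illustrated with a toy computation. This is not CORAL's actual algorithm (which chains core blocks within an optimal local profile-profile alignment); the profiles, the 4-letter alphabet and the dot-product column score below are all invented for the sketch.

```python
import numpy as np

def block_scores(core_block, profile):
    """Slide a gap-free core block along a target profile, scoring each
    placement as the sum of column dot products (a toy column-similarity
    measure; real profile-profile methods use e.g. log-odds scores)."""
    L, k = len(profile), len(core_block)
    return [
        sum(float(core_block[i] @ profile[off + i]) for i in range(k))
        for off in range(L - k + 1)
    ]

# Invented profiles over a toy 4-letter alphabet: rows are alignment
# columns holding residue frequencies.
target = np.full((12, 4), 0.25)        # mostly uninformative columns
target[5:8] = np.eye(4)[[0, 2, 1]]     # a sharply conserved 3-column core
core = target[5:8].copy()              # the "core region" to place

scores = block_scores(core, target)
best = int(np.argmax(scores))          # best gap-free placement of the core
```

    Because the block moves as a rigid unit, no placement can open a gap inside the core, which is the continuity constraint the abstract describes.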

  11. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas; Ouverture des marches energetiques: consequences sur les missions de service public et de securite d'approvisionnement pour l'electricite et le gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general direction of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - public utility in the domain of energy: definition of the public utility missions, feedback from experience with liberalized markets, public utility obligations and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supplies, opinions of operators, security of power supplies versus liberalization and investments; 4 - security of gas supplies: markets liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and of security of supplies in a competitive market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  12. Computational intelligence techniques in health care

    CERN Document Server

    Zhou, Wengang; Satheesh, P

    2016-01-01

    This book presents research on emerging computational intelligence techniques and tools, with a particular focus on new trends and applications in health care. Healthcare is a multi-faceted domain, which incorporates advanced decision-making, remote monitoring, healthcare logistics, operational excellence and modern information systems. In recent years, the use of computational intelligence methods to address the scale and the complexity of the problems in healthcare has been investigated. This book discusses various computational intelligence methods that are implemented in applications in different areas of healthcare. It includes contributions by practitioners, technology developers and solution providers.

  13. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet, with the sole purpose of utilizing shared resources such as software and hardware on a pay-per-use model. Using cloud computing, radiology users can efficiently manage multimodality imaging units with the latest software and hardware without paying large upfront costs. Cloud computing systems usually follow public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  14. Linking computers for science

    CERN Multimedia

    2005-01-01

    After the success of SETI@home, many other scientists have found computer power donated by the public to be a valuable resource - and sometimes the only possibility to achieve their goals. In July, representatives of several “public resource computing” projects came to CERN to discuss technical issues and R&D activities on the common computing platform they are using, BOINC. This photograph shows the LHC@home screen-saver which uses the BOINC platform: the dots represent protons and the position of the status bar indicates the progress of the calculations. This summer, CERN hosted the first “pangalactic workshop” on BOINC (Berkeley Open Infrastructure for Network Computing). BOINC is modelled on SETI@home, which millions of people have downloaded to help search for signs of extraterrestrial intelligence in radio-astronomical data. BOINC provides a general-purpose framework for scientists to adapt their software to, so that the public can install and run it. An important part of BOINC is managing the...

  15. Lattice QCD with Domain Decomposition on Intel Xeon Phi Co-Processors

    Energy Technology Data Exchange (ETDEWEB)

    Heybrock, Simon; Joo, Balint; Kalamkar, Dhiraj D; Smelyanskiy, Mikhail; Vaidyanathan, Karthikeyan; Wettig, Tilo; Dubey, Pradeep

    2014-12-01

    The gap between the cost of moving data and the cost of computing continues to grow, making it ever harder to design iterative solvers on extreme-scale architectures. This problem can be alleviated by alternative algorithms that reduce the amount of data movement. We investigate this in the context of Lattice Quantum Chromodynamics and implement such an alternative solver algorithm, based on domain decomposition, on Intel Xeon Phi co-processor (KNC) clusters. We demonstrate close-to-linear on-chip scaling to all 60 cores of the KNC. With a mix of single- and half-precision the domain-decomposition method sustains 400-500 Gflop/s per chip. Compared to an optimized KNC implementation of a standard solver [1], our full multi-node domain-decomposition solver strong-scales to more nodes and reduces the time-to-solution by a factor of 5.
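
    The flavor of a domain-decomposition iteration, in which each subdomain is solved independently so that data movement between domains reduces to residual exchanges, can be sketched on a toy 1D Poisson problem. This is a damped additive Schwarz iteration; the Wilson-Dirac operator, the mixed single/half precision and the Xeon Phi specifics of the paper are far beyond this sketch, and the sizes below are invented.

```python
import numpy as np

n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Laplacian
b = np.ones(n)

# Two overlapping subdomains; the overlap keeps the iteration convergent.
doms = [np.arange(0, 40), np.arange(24, 64)]

x = np.zeros(n)
for _ in range(2000):
    r = b - A @ x
    if np.linalg.norm(r) < 1e-10 * np.linalg.norm(b):
        break
    dx = np.zeros(n)
    for idx in doms:
        # Each subdomain problem is solved independently of the other --
        # the communication-avoiding ingredient of the method.
        Ai = A[np.ix_(idx, idx)]
        dx[idx] += np.linalg.solve(Ai, r[idx])
    x += 0.5 * dx   # damping accounts for the doubly-updated overlap
```

    Only the residual vector couples the subdomains between iterations; in a parallel setting that coupling is the entire inter-node traffic, which is the data-movement reduction the abstract targets.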

  16. Domain decomposed preconditioners with Krylov subspace methods as subdomain solvers

    Energy Technology Data Exchange (ETDEWEB)

    Pernice, M. [Univ. of Utah, Salt Lake City, UT (United States)

    1994-12-31

    Domain decomposed preconditioners for nonsymmetric partial differential equations typically require the solution of problems on the subdomains. Most implementations employ exact solvers to obtain these solutions. Consequently, work and storage requirements for the subdomain problems grow rapidly with the size of the subdomain problems. Subdomain solves constitute the single largest computational cost of a domain decomposed preconditioner, and improving the efficiency of this phase of the computation will have a significant impact on the performance of the overall method. The small local memory available on the nodes of most message-passing multicomputers motivates consideration of the use of an iterative method for solving subdomain problems. For large-scale systems of equations that are derived from three-dimensional problems, memory considerations alone may dictate the need for using iterative methods for the subdomain problems. In addition to reduced storage requirements, use of an iterative solver on the subdomains allows flexibility in specifying the accuracy of the subdomain solutions. Substantial savings in solution time are possible if the quality of the domain decomposed preconditioner is not degraded too much by relaxing the accuracy of the subdomain solutions. While some work in this direction has been conducted for symmetric problems, similar studies for nonsymmetric problems appear not to have been pursued. This work represents a first step in this direction, and explores the effectiveness of performing subdomain solves using several transpose-free Krylov subspace methods: GMRES, transpose-free QMR, CGS, and a smoothed version of CGS. Depending on the difficulty of the subdomain problem and the convergence tolerance used, a reduction in solution time is possible in addition to the reduced memory requirements. The domain decomposed preconditioner is a Schur complement method in which the interface operators are approximated using interface probing.
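
    A minimal sketch of the inner-outer idea, assuming a 1D Laplacian and a non-overlapping two-block decomposition (not the Schur complement method or the transpose-free solvers of the abstract): a block-Jacobi preconditioner applies an inner Krylov solver (CG) to each subdomain block instead of an exact factorization, and the inner tolerance `rtol` is exactly the knob the abstract discusses relaxing to trade preconditioner quality for cheaper subdomain solves.

```python
import numpy as np

def cg(A, b, rtol=1e-10, maxiter=100):
    """Bare-bones conjugate gradients. Relaxing rtol turns this into the
    *inexact* subdomain solver discussed in the abstract."""
    x = np.zeros_like(b)
    bnorm = np.linalg.norm(b)
    if bnorm == 0.0:
        return x
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_next = r @ r
        if np.sqrt(rs_next) <= rtol * bnorm:
            break
        p = r + (rs_next / rs) * p
        rs = rs_next
    return x

n = 30
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)    # 1D Laplacian
b = np.ones(n)
blocks = [np.arange(0, 15), np.arange(15, 30)]          # non-overlapping subdomains

def apply_M_inv(r):
    """Block-Jacobi preconditioner whose subdomain solves use inner CG
    rather than an exact factorization."""
    z = np.zeros_like(r)
    for idx in blocks:
        z[idx] = cg(A[np.ix_(idx, idx)], r[idx])
    return z

# Outer preconditioned Richardson iteration on A x = b
x = np.zeros(n)
for _ in range(800):
    r = b - A @ x
    if np.linalg.norm(r) < 1e-9 * np.linalg.norm(b):
        break
    x += apply_M_inv(r)
```

    The inner solver stores only a handful of vectors per subdomain, which is the reduced-memory argument made in the abstract for message-passing machines with small local memory.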

  17. Computer Applications in Production and Engineering

    DEFF Research Database (Denmark)

    Sørensen, Torben

    1997-01-01

    This paper addresses how neutral product model interfaces can be identified, specified, and implemented to provide intelligent and flexible means for information management in manufacturing of discrete mechanical products. The use of advanced computer based systems, such as CAD, CAE, CNC, and robotics, offers a potential for significant cost-savings and quality improvements in manufacturing of discrete mechanical products. However, these systems are introduced into production as 'islands of automation' or 'islands of information', and to benefit from the said potential, the systems must be integrated … domains; the CA(X) systems are placed in two different domains for design and planning, respectively. A third domain within the CIME architecture comprises the automated equipment on the shop floor.

  18. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  19. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  20. Information Pre-Processing using Domain Meta-Ontology and Rule Learning System

    Science.gov (United States)

    Ranganathan, Girish R.; Biletskiy, Yevgen

    Around the globe, extraordinary numbers of documents are being created by enterprises and by users outside these enterprises. The documents created within enterprises constitute the main focus of the present chapter. These documents are used for numerous kinds of machine processing, and a lack of semantics in the information they contain may cause misinterpretation, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable for enterprises to use well-defined domain ontologies, which serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. Extracting and capturing domain ontologies from such voluminous documents requires extensive involvement of domain experts and the application of ontology learning methods that are substantially labor intensive; therefore, intermediate solutions which assist in capturing domain ontologies must be developed. This chapter proposes a solution in this direction: building a meta-ontology that serves as an intermediate information source for the main domain ontology and as a rapid approach to conceptualizing a domain of interest from a huge amount of source documents. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined into a better domain ontology either through automatic ontology learning methods or some other relevant ontology building approach.
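
    The population step described above can be sketched in a few lines; everything concrete here (the class layout, the "X is a Y" pattern, the sample documents) is invented for illustration and is not the chapter's actual method.

```python
import re
from dataclasses import dataclass, field

# Illustrative-only data model for a meta-ontology populated from documents.
@dataclass
class Concept:
    name: str
    attributes: set = field(default_factory=set)

@dataclass
class MetaOntology:
    concepts: dict = field(default_factory=dict)
    relations: set = field(default_factory=set)

    def add_concept(self, name):
        self.concepts.setdefault(name, Concept(name))

    def add_relation(self, subj, pred, obj):
        self.add_concept(subj)
        self.add_concept(obj)
        self.relations.add((subj, pred, obj))

# Naive population step: harvest "X is a Y" statements from documents.
docs = ["An invoice is a document. A purchase order is a document."]
onto = MetaOntology()
for doc in docs:
    for subj, obj in re.findall(r"\b[Aa]n? ([\w ]+?) is an? ([\w ]+?)\.", doc):
        onto.add_relation(subj.strip(), "is_a", obj.strip())
```

    In the chapter's terms, the crude pattern matching stands in for ontology learning: it yields a rough meta-ontology of concepts and relations that a domain expert or an automatic method would later refine into the proper domain ontology.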