WorldWideScience

Sample records for public domain computer

  1. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Science.gov (United States)

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... GENERAL PROVISIONS § 201.26 Recordation of documents pertaining to computer shareware and donation of public domain computer software. (a) General. This section prescribes the procedures for submission of...

  2. PUBLIC DOMAIN PROTECTION. USES AND REUSES OF PUBLIC DOMAIN WORKS

    Directory of Open Access Journals (Sweden)

    Monica Adriana LUPAȘCU

    2015-07-01

    Full Text Available This study tries to highlight the necessity of an awareness of the right of access to the public domain, particularly using the example of works whose protection period has expired, as well as the ones which the law considers to be excluded from protection. Such works are used not only by large libraries from around the world, but also by rights holders, via different means of use, including incorporations into original works or adaptations. However, the reuse that follows these uses often only remains at the level of concept, as the notion of the public’s right of access to public domain works is not substantiated, nor is the notion of the correct or legal use of such works.

  3. Public licenses and public domain as alternatives to copyright

    OpenAIRE

    Köppel, Petr

    2012-01-01

    The work first introduces the area of public licenses as a space between the copyright law and public domain. After that, consecutively for proprietary software, free and open source software, open hardware and open content, it maps particular types of public licenses and the accompanying social and cultural movements, puts them in mutual as well as historical context, examines their characteristics and compares them to each other, shows how the public licenses are defined by various accompan...

  4. Domain of attraction computation for tumor dynamics

    NARCIS (Netherlands)

    Doban, A.I.; Lazar, M.

    2014-01-01

    In this paper we propose the use of rational Lyapunov functions to estimate the domain of attraction of the tumor dormancy equilibrium of immune cells-malignant cells interaction dynamics. A procedure for computing rational Lyapunov functions is worked out, with focus on obtaining a meaningful

  5. Domain decomposition methods and parallel computing

    International Nuclear Information System (INIS)

    Meurant, G.

    1991-01-01

In this paper, we show how to efficiently solve large linear systems on parallel computers. These linear systems arise from the discretization of scientific computing problems described by systems of partial differential equations. We show how to obtain a discrete finite-dimensional system from the continuous problem, and the chosen conjugate gradient iterative algorithm is briefly described. Then, the different kinds of parallel architectures are reviewed and their advantages and deficiencies are emphasized. We sketch the problems found in programming the conjugate gradient method on parallel computers. For this algorithm to be efficient on parallel machines, domain decomposition techniques are introduced. We give results of numerical experiments showing that these techniques allow a good rate of convergence for the conjugate gradient algorithm as well as computational speeds in excess of a billion floating-point operations per second. (author). 5 refs., 11 figs., 2 tabs., 1 inset
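The conjugate gradient iteration referenced in this record can be sketched in a few lines. The following is a minimal serial illustration on a 1D Poisson system, not the authors' parallel, domain-decomposed implementation:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for a symmetric positive definite system Ax = b."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# 1D Poisson matrix from a finite-difference discretization (n interior points)
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b, atol=1e-6))  # True
```

In a domain decomposition setting, the matrix-vector product and inner products above are the operations distributed across subdomains.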

  6. Cultural Heritage and the Public Domain

    Directory of Open Access Journals (Sweden)

    Bas Savenije

    2012-09-01

by providing their resources on the Internet” (Berlin Declaration 2003). Therefore, in the spirit of the Berlin Declaration, the ARL encourages its members’ libraries to grant all non-commercial users “a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship”. And: “If fees are to be assessed for the use of digitised public domain works, those fees should only apply to commercial uses” (ARL Principles July 2010). In our view, cultural heritage institutions should make public domain material digitised with public funding as widely available as possible for access and reuse. The public sector has the primary responsibility to fund digitisation. The involvement of private partners, however, is encouraged by ARL as well as the Comité des Sages. Private funding for digitisation is a complement to the necessary public investment, especially in times of economic crisis, but should not be seen as a substitute for public funding. As we can see from these reports, there are a number of arguments in favour of digitisation and also of providing maximum accessibility to the digitised cultural heritage. In this paper we will investigate the legal aspects of digitisation of cultural heritage, especially public domain material. On the basis of these we will make an inventory of policy considerations regarding reuse. Furthermore, we will describe the conclusions the National Library of the Netherlands (hereafter: KB) has formulated and the arguments that support these. In this context we will review public-private partnerships and also the policy of the KB. We will conclude with recommendations for cultural heritage institutions concerning a reuse policy for digitised public domain material.

  7. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial … on secret values, and results are only revealed according to specific protocols. We identify the key linguistic concepts of SMC and bridge the gap between high-level security requirements and low-level cryptographic operations constituting an SMC platform, thus improving the efficiency and security of SMC …

  8. Preserving the positive functions of the public domain in science

    Directory of Open Access Journals (Sweden)

    Pamela Samuelson

    2003-11-01

    Full Text Available Science has advanced in part because data and scientific methodologies have traditionally not been subject to intellectual property protection. In recent years, intellectual property has played a greater role in scientific work. While intellectual property rights may have a positive role to play in some fields of science, so does the public domain. This paper will discuss some of the positive functions of the public domain and ways in which certain legal developments may negatively impact the public domain. It suggests some steps that scientists can take to preserve the positive functions of the public domain for science.

  9. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

Anderson, Thomas G [Albuquerque, NM]

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  10. The Definition, Dimensions, and Domain of Public Relations.

    Science.gov (United States)

    Hutton, James G.

    1999-01-01

    Discusses how the field of public relations has left itself vulnerable to other fields that are making inroads into public relations' traditional domain, and to critics who are filling in their own definitions of public relations. Proposes a definition and a three-dimensional framework to compare competing philosophies of public relations and to…

  11. Computational thinking as an emerging competence domain

    NARCIS (Netherlands)

    Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.

    2016-01-01

    Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been

  12. Violence defied? : A review of prevention of violence in public and semi-public domain

    NARCIS (Netherlands)

    Knaap, L.M. van der; Nijssen, L.T.J.; Bogaerts, S.

    2006-01-01

This report provides a synthesis of 48 studies of the effects of the prevention of violence in the public and semi-public domain. The following research questions were stated for this study: What measures for the prevention of violence in the public and semi-public domain are known and have been

  13. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  14. Assessment of current cybersecurity practices in the public domain : cyber indications and warnings domain.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

This report assesses current public domain cyber security practices with respect to cyber indications and warnings. It describes cybersecurity industry and government activities, including cybersecurity tools, methods, practices, and international and government-wide initiatives known to be impacting current practice. Of particular note are the U.S. Government's Trusted Internet Connection (TIC) and 'Einstein' programs, which are serving to consolidate the Government's internet access points and to provide some capability to monitor and mitigate cyber attacks. Next, this report catalogs activities undertaken by various industry and government entities. In addition, it assesses the benchmarks of HPC capability and other HPC attributes that may lend themselves to assist in the solution of this problem. This report draws few conclusions, as it is intended to assess current practice in preparation for future work; however, no explicit references to HPC usage for the purpose of analyzing cyber infrastructure in near-real-time were found in current practice. This report and a related report (SAND2010-4766, National Cyber Defense High Performance Computing and Analysis: Concepts, Planning and Roadmap) are intended to provoke discussion throughout a broad audience about developing a cohesive HPC-centric solution to wide-area cybersecurity problems.

  15. Domain Decomposition: A Bridge between Nature and Parallel Computers

    Science.gov (United States)

    1992-09-01

NASA Contractor Report 189709; ICASE Report No. 92-44: "Domain Decomposition: A Bridge between Nature and Parallel Computers." The report discusses how domain decomposition algorithms can be effectively implemented on distributed-memory multiprocessors; in 1990 (as reported in Ref. 38, using the tile algorithm), a 103,201-unknown 2D elliptic problem was solved. Cited: "Domain Decomposition Algorithms for Indefinite Elliptic Problems," SIAM Journal on Scientific and Statistical Computing, Vol. 13, 1992, pp. …

  16. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

Full Text Available In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences between root properties measured with ImageJ and Safira were expected. This study compared values of root length and surface area obtained with the public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by the confidence interval and t-test. Both ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those with the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are
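Pearson's correlation coefficient, used in this record to compare the image-analysis systems against the reference method, can be computed directly. A small sketch with hypothetical root-length readings (the numbers below are illustrative, not data from the study):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's correlation coefficient between two measurement series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()
    yc = y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Hypothetical root-length readings (mm): reference method vs. an image-analysis system
reference = [120.0, 95.5, 210.2, 60.3, 150.8]
system    = [118.4, 99.0, 205.6, 66.1, 148.9]
r = pearson_r(reference, system)
print(round(r, 3))
```

A value near 1 indicates the system tracks the reference method closely; the study's reported range of 0.54 to 0.80 is weaker than this synthetic case.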

  17. A Public Domain Software Library for Reading and Language Arts.

    Science.gov (United States)

    Balajthy, Ernest

    A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…

  18. Error analysis of a public domain pronunciation dictionary

    CSIR Research Space (South Africa)

    Martirosian, O

    2007-11-01

    Full Text Available ], a popular public domain resource that is widely used in English speech processing systems. The techniques being investigated are applied to the lexicon and the results of each step are illustrated using sample entries. The authors found that as many...

  19. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    Science.gov (United States)

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  20. Public Domain; Public Interest; Public Funding: Focussing on the ‘three Ps’ in Scientific Research

    Directory of Open Access Journals (Sweden)

    Mags McGinley

    2005-03-01

Full Text Available The purpose of this paper is to discuss the ‘three Ps’ of scientific research: Public Domain; Public Interest; Public Funding. This is done by examining some of the difficulties faced by scientists engaged in scientific research who may have problems working within the constraints of current copyright and database legislation, where property claims can place obstacles in the way of research, in other words, in the way of the public domain. The article then looks at perceptions of the public interest and asks whether copyright and the database right reflect understandings of how this concept should operate. Thirdly, it considers the relevance of public funding for scientific research in the context of both the public domain and of the public interest. Finally, some recent initiatives seeking to change the contours of the legal framework are examined.

  1. Remotely Piloted Aircraft and War in the Public Relations Domain

    Science.gov (United States)

    2014-10-01

… the terms as they appear in quoted texts. 2. Peter Kreeft, Socratic Logic: A Logic Text Using Socratic Method, Platonic Questions, and Aristotelian … Ronald Brooks. This method of refuting an argument reflects option C (above), demonstrating that the conclusion does not follow from the premises … the International Security Assistance Force (ISAF) met to discuss methods of eliminating civilian casualties in …

  2. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it presents all well-known algorithms in detail, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  3. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
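The system-identification step described in this record, estimating an unknown dynamics model from time-history data, can be illustrated with a least-squares fit of a discrete ARX model. The system, coefficients, and signals below are synthetic and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a known second-order system: y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1]
a1, a2, b1 = 1.5, -0.7, 0.5
u = rng.standard_normal(500)      # excitation input
y = np.zeros(500)
for k in range(2, 500):
    y[k] = a1 * y[k-1] + a2 * y[k-2] + b1 * u[k-1]

# Build the regression matrix from lagged outputs and inputs,
# then estimate the coefficients by least squares
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print(np.round(theta, 3))  # close to [1.5, -0.7, 0.5]
```

In a flutter analysis the identified model would stand in for the unsteady aerodynamics and be coupled with the known structural model before the stability search.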

  4. Agents unleashed a public domain look at agent technology

    CERN Document Server

    Wayner, Peter

    1995-01-01

    Agents Unleashed: A Public Domain Look at Agent Technology covers details of building a secure agent realm. The book discusses the technology for creating seamlessly integrated networks that allow programs to move from machine to machine without leaving a trail of havoc; as well as the technical details of how an agent will move through the network, prove its identity, and execute its code without endangering the host. The text also describes the organization of the host's work processing an agent; error messages, bad agent expulsion, and errors in XLISP-agents; and the simulators of errors, f

  5. Computational design of binding proteins to EGFR domain II.

    Directory of Open Access Journals (Sweden)

    Yoon Sup Choi

Full Text Available We developed a process to produce novel interactions between two previously unrelated proteins. This process selects protein scaffolds and designs protein interfaces that bind to a surface patch of interest on a target protein. Scaffolds with shapes complementary to the target surface patch were screened using an exhaustive computational search of the human proteome and optimized by directed evolution using phage display. This method was applied to successfully design scaffolds that bind to epidermal growth factor receptor (EGFR) domain II, the interface of EGFR dimerization, with high reactivity toward the target surface patch of EGFR domain II. One potential application of these tailor-made protein interactions is the development of therapeutic agents against specific protein targets.

  6. Bringing computational science to the public.

    Science.gov (United States)

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  7. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
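The finite-difference time-domain method covered by this book can be demonstrated with a minimal 1D free-space sketch in normalized units (illustrative only, not taken from the text):

```python
import numpy as np

# Minimal 1D FDTD (Yee scheme): electric and magnetic fields are staggered
# in space and leapfrogged in time; Courant factor 0.5 keeps the scheme stable.
nx, nt = 200, 150
ez = np.zeros(nx)   # electric field
hy = np.zeros(nx)   # magnetic field
for t in range(nt):
    # Update magnetic field from the spatial difference of E
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
    # Update electric field from the spatial difference of H
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])
    # Soft source: inject a Gaussian pulse at the grid center
    ez[nx // 2] += np.exp(-((t - 30) / 10) ** 2)

print(np.abs(ez).max() > 0)  # True
```

Full FDTD codes add absorbing boundaries, material parameters, and 2D/3D field components, but the update pattern above is the core of the method.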

  8. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  9. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  10. Time-Domain Terahertz Computed Axial Tomography NDE System

    Science.gov (United States)

    Zimdars, David

    2012-01-01

NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in the detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment, where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects, including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320-ps, 100-Hz waveform acquisition system with an 80-ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf (COTS) subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples; it allows inspection of sample sizes of up to approximately one cubic foot (~0.03 cubic meters). The system reduced to practice a CT TD-THz system incorporating a COTS 80-ps/1-kHz waveform scanner. The incorporation of this scanner in the system allows acquisition of 3D

  11. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
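Approach (iv), embedding the simulation specification in an existing programming language, can be sketched with a toy Python class hierarchy. All class and parameter names below are hypothetical and do not reflect the actual OOMMF interface described in the paper:

```python
# Toy embedded DSL for a simulation specification. A real backend would
# translate the collected terms into a solver input file (e.g. for OOMMF).
class EnergyTerm:
    """Base class: an energy contribution with named parameters."""
    def __init__(self, **params):
        self.params = params
    def __repr__(self):
        args = ", ".join(f"{k}={v}" for k, v in self.params.items())
        return f"{type(self).__name__}({args})"

class Exchange(EnergyTerm): pass   # hypothetical exchange-energy term
class Zeeman(EnergyTerm): pass     # hypothetical applied-field term

class System:
    """Collects energy terms describing one simulation run."""
    def __init__(self, name):
        self.name = name
        self.energy = []
    def add(self, term):
        self.energy.append(term)
        return self  # allow chaining

system = System("standard_problem_4")
system.add(Exchange(A=1.3e-11)).add(Zeeman(H=(0, 0, 8e4)))
print(system.energy)
```

The benefit the paper identifies is that the full host language (loops, functions, libraries) is available for composing and parameterizing such specifications, which aids reproducibility.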

  12. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

Full Text Available Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  13. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

As an interdisciplinary domain requiring advanced and innovative methodologies, computational forensics is characterized by data that are simultaneously large-scale, uncertain, multidimensional, and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...
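Fuzzy clustering of the kind referenced in this record can be illustrated with a basic fuzzy c-means implementation, in which each point receives a graded membership in every cluster rather than a hard assignment. This is a generic sketch, not the authors' procedure:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Basic fuzzy c-means: returns cluster centers and the membership matrix U."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m                              # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # distances from every point to every center (small epsilon avoids /0)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))          # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated synthetic groups in the plane
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
centers, U = fuzzy_c_means(X)
print(np.round(np.sort(centers[:, 0]), 1))
```

On forensic data, the membership matrix U is what lets analysts reason about cases that belong partially to several patterns at once.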

  14. Fast resolution of the neutron diffusion equation through public domain Ode codes

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, V.M.; Vidal, V.; Garayoa, J. [Universidad Politecnica de Valencia, Departamento de Sistemas Informaticos, Valencia (Spain); Verdu, G. [Universidad Politecnica de Valencia, Departamento de Ingenieria Quimica y Nuclear, Valencia (Spain); Gomez, R. [I.E.S. de Tavernes Blanques, Valencia (Spain)

    2003-07-01

    The time-dependent neutron diffusion equation is a partial differential equation with source terms. The resolution method usually includes discretizing the spatial domain, obtaining a large system of linear, stiff ordinary differential equations (ODEs), whose resolution is computationally very expensive. Some standard techniques use a fixed time step to solve the ODE system. This can result in errors (if the time step is too large) or in long computing times (if the time step is too little). To speed up the resolution method, two well-known public domain codes have been selected: DASPK and FCVODE that are powerful codes for the resolution of large systems of stiff ODEs. These codes can estimate the error after each time step, and, depending on this estimation can decide which is the new time step and, possibly, which is the integration method to be used in the next step. With these mechanisms, it is possible to keep the overall error below the chosen tolerances, and, when the system behaves smoothly, to take large time steps increasing the execution speed. In this paper we address the use of the public domain codes DASPK and FCVODE for the resolution of the time-dependent neutron diffusion equation. The efficiency of these codes depends largely on the preconditioning of the big systems of linear equations that must be solved. Several pre-conditioners have been programmed and tested; it was found that the multigrid method is the best of the pre-conditioners tested. Also, it has been found that DASPK has performed better than FCVODE, being more robust for our problem.We can conclude that the use of specialized codes for solving large systems of ODEs can reduce drastically the computational work needed for the solution; and combining them with appropriate pre-conditioners, the reduction can be still more important. 
It has other crucial advantages, since it allows the user to specify the allowed error, which cannot be done in fixed step implementations; this, of course
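The error-controlled stepping that the abstract attributes to DASPK and FCVODE can be sketched in a few lines. The following is a minimal pure-Python illustration of the idea, not the codes themselves: a backward-Euler integrator for the linear test problem y' = λy that estimates the local error by step doubling and enlarges or reduces the step to keep the estimate below a tolerance. The solver, tolerances and test problem are illustrative assumptions.

```python
import math

def backward_euler_step(y, lam, h):
    # Implicit Euler for y' = lam*y: y_new = y / (1 - lam*h) (exact linear solve).
    return y / (1.0 - lam * h)

def integrate_adaptive(lam, y0, t_end, tol=1e-6, h=1e-3):
    """Error-controlled integration of y' = lam*y from t = 0 to t_end.

    After each step the local error is estimated by comparing one full
    step against two half steps (step doubling); the step size is then
    enlarged or reduced to keep the estimate below `tol`, mimicking the
    adaptive control performed internally by codes such as DASPK/FCVODE.
    """
    t, y = 0.0, y0
    while t < t_end:
        h = min(h, t_end - t)
        y_full = backward_euler_step(y, lam, h)
        y_half = backward_euler_step(
            backward_euler_step(y, lam, h / 2.0), lam, h / 2.0)
        err = abs(y_half - y_full)            # local error estimate
        if err <= tol:                        # accept the (better) half-step value
            t, y = t + h, y_half
            h *= min(2.0, 0.9 * math.sqrt(tol / max(err, 1e-16)))
        else:                                 # reject and retry with a smaller step
            h *= max(0.1, 0.9 * math.sqrt(tol / err))
    return y

y = integrate_adaptive(lam=-50.0, y0=1.0, t_end=0.1)
```

On this stiff decay problem the step size shrinks at the fast initial transient and grows afterwards, which is exactly the behaviour that lets adaptive codes outrun fixed-step schemes.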

  15. The Public Use Limitation in Eminent Domain: "Handley v. Cook."

    Science.gov (United States)

    Grill, Donna P.

    1979-01-01

    It is time for the courts to rigorously scrutinize allegations of public use in order to protect the property rights of private individuals. Available from West Virginia Law Review, W.V.U. Law Center, Morgantown, WV 26506. (Author)

  16. The Domain Shared by Computational and Digital Ontology: A Phenomenological Exploration and Analysis

    Science.gov (United States)

    Compton, Bradley Wendell

    2009-01-01

    The purpose of this dissertation is to explore and analyze a domain of research thought to be shared by two areas of philosophy: computational and digital ontology. Computational ontology is philosophy used to develop information systems, also called computational ontologies. Digital ontology is philosophy dealing with our understanding of Being…

  17. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  18. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  19. Computing the Feng-Rao distances for codes from order domains

    DEFF Research Database (Denmark)

    Ruano Benito, Diego

    2007-01-01

    We compute the Feng–Rao distance of a code coming from an order domain with a simplicial value semigroup. The main tool is the Apéry set of a semigroup, which can be computed using a Gröbner basis.
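As a toy illustration of the main tool mentioned in this abstract, the Apéry set of an ordinary numerical semigroup can be computed directly from its generators by dynamic programming. This sketch does not reproduce the paper's Gröbner-basis machinery for order domains; the generators, bound and helper names are illustrative.

```python
def semigroup_elements(gens, bound):
    # All elements of the numerical semigroup <gens> up to `bound`,
    # built by dynamic programming over non-negative integer combinations.
    reachable = [False] * (bound + 1)
    reachable[0] = True
    for v in range(1, bound + 1):
        reachable[v] = any(g <= v and reachable[v - g] for g in gens)
    return [v for v in range(bound + 1) if reachable[v]]

def apery_set(gens, n):
    """Apery set Ap(S, n): the smallest element of S in each residue
    class modulo n.  The search bound n * max(gens) is a crude choice
    adequate for small examples."""
    bound = n * max(gens)
    ap = {}
    for s in semigroup_elements(gens, bound):   # visited in increasing order
        r = s % n
        if r not in ap:
            ap[r] = s
    return [ap[r] for r in range(n)]

ap = apery_set([3, 5], 3)   # semigroup <3,5>: Ap(S,3) = [0, 10, 5]
frobenius = max(ap) - 3     # classical formula F(S) = max Ap(S,n) - n, here 7
```

For the semigroup generated by 3 and 5 this recovers the well-known Frobenius number 7.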

  20. Materialities of Law: Celebrity Production and the Public Domain

    Directory of Open Access Journals (Sweden)

    Esther Milne

    2009-12-01

    Full Text Available Celebrity production and consumption are powerful socio-economic forces. The celebrity functions as a significant economic resource for the commercial sector and plays a fundamental symbolic role within culture by providing a shared ‘vocabulary’ through which to understand contemporary social relations. A pivotal element of this allure is the process by which the celebrity figure is able to forge an intimate link with its audience, often producing public expressions of profound compassion, respect or revulsion. This process, however, is complicated by emerging participatory media forms, whose impact is experienced as new conditions of possibility for celebrity production and consumption. As Marshall argues, video mash-ups of celebrity interviews, such as those of Christian Bale or Tom Cruise, are dramatically changing the relation between celebrity and audience (Marshall, 2006: 640). Meanings produced by these audience remixes challenge the extent to which a celebrity might control her image. So is the celebrity personality, therefore, a public or private commodity? Who owns the celebrity image within remix culture? Although the celebrity figure has been thoroughly researched in relation to its patterns of consumption, semiotic power and industry construction, less attention has been focused on the forms of celebrity governance enabled by legislative and case-law settings. How might the law deal with the significant economic and cultural power exercised within celebrity culture?

  1. 77 FR 4568 - Annual Computational Science Symposium; Public Conference

    Science.gov (United States)

    2012-01-30

    ...] Annual Computational Science Symposium; Public Conference AGENCY: Food and Drug Administration, HHS... with the Pharmaceutical Users Software Exchange (PhUSE), is announcing a public conference entitled ``The FDA/PhUSE Annual Computational Science Symposium.'' The purpose of the conference is to help the...

  2. Survey of Energy Computing in the Smart Grid Domain

    OpenAIRE

    Rajesh Kumar; Arun Agarwala

    2013-01-01

    Resource optimization with advanced computing tools improves the efficient use of energy resources. Renewable energy resources are instantaneous and need to be conserved at the same time. Optimizing real-time processes requires a complex design that includes resource planning and control for effective utilization. Advances in information and communication technology tools enable data formatting and analysis, resulting in optimized use of renewable resources for sustainable energy solutions on s...

  3. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  4. Supporting students' learning in the domain of computer science

    Science.gov (United States)

    Gasparinatou, Alexandra; Grigoriadou, Maria

    2011-03-01

    Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65) read one of four versions of a text concerning Local Network Topologies, orthogonally varying local and global cohesion. Participants' comprehension was examined through a free-recall measure; text-based, bridging-inference, elaborative-inference and problem-solving questions; and a sorting task. The results indicated that high-knowledge readers benefited from the low-cohesion text. The interaction of text cohesion and knowledge was reliable for the sorting activity, for elaborative-inference questions and for problem-solving questions. Although high-knowledge readers performed better on text-based and bridging-inference questions with the low-cohesion text, the interaction of text cohesion and knowledge was not reliable. The results suggest a more complex view of when and for whom textual cohesion affects comprehension and, consequently, learning in computer science.

  5. Accumulation of Domain-Specific Physical Inactivity and Presence of Hypertension in Brazilian Public Healthcare System.

    Science.gov (United States)

    Turi, Bruna Camilo; Codogno, Jamile S; Fernandes, Romulo A; Sui, Xuemei; Lavie, Carl J; Blair, Steven N; Monteiro, Henrique Luiz

    2015-11-01

    Hypertension is one of the most common noncommunicable diseases worldwide, and physical inactivity is a risk factor predisposing to its occurrence and complications. However, the association between physical inactivity domains and hypertension remains unclear, especially in public healthcare systems. Thus, this study aimed to investigate the association between the aggregation of physical inactivity in different domains and the prevalence of hypertension among users of the Brazilian public health system. The sample comprised 963 participants. Subjects were divided into quartile groups according to 3 different domains of physical activity (occupational; physical exercises; and leisure-time and transportation). Hypertension was based on physician diagnosis. Physical inactivity in the occupational domain was significantly associated with a higher prevalence of hypertension (OR = 1.52 [1.05 to 2.21]). The same pattern occurred for physical inactivity in leisure-time (OR = 1.63 [1.11 to 2.39]) and for the aggregation of physical inactivity in all 3 domains (OR = 2.46 [1.14 to 5.32]). The multivariate-adjusted model also showed a significant association between hypertension and physical inactivity in all 3 domains (OR = 2.57 [1.14 to 5.79]). The results suggest an unequal prevalence of hypertension across physical inactivity domains; increased promotion of physical activity within the healthcare system is needed.
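For readers unfamiliar with the statistic quoted above, an odds ratio with Woolf's 95% confidence interval is computed from a 2×2 table as sketched below. The counts are hypothetical, not the study's data; they merely show how values such as OR = 1.52 [1.05 to 2.21] arise.

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    # Odds ratio from a 2x2 table, with Woolf's 95% confidence interval
    # based on the standard error of log(OR).
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 120 hypertensive and 280 normotensive among the
# occupationally inactive, versus 80 and 280 among the active.
or_, lo, hi = odds_ratio(120, 280, 80, 280)   # OR = 1.5
```

An interval whose lower bound exceeds 1.0, as here, is what makes an association such as the occupational-domain result statistically significant.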

  6. Assessing water availability over peninsular Malaysia using public domain satellite data products

    International Nuclear Information System (INIS)

    Ali, M I; Hashim, M; Zin, H S M

    2014-01-01

    Water availability monitoring is an essential task for water resource sustainability and security. In this paper, the assessment of a satellite remote sensing technique for determining water availability is reported. A water-balance analysis is used to compute spatio-temporal water availability from two main inputs, precipitation and the actual evapotranspiration rate (AET), fully derived from the public-domain satellite products of the Tropical Rainfall Measurement Mission (TRMM) and MODIS, respectively. Both satellite products were first calibrated against selected local precipitation and AET samples. Multi-temporal data sets acquired in 2000-2010 were used in this study. The results indicated strong agreement of monthly water availability with the basin flow rate (r² = 0.5, p < 0.001). Similar agreement was also noted between the estimated annual average water availability and the in-situ measurement. It is therefore concluded that the method devised in this study provides a new alternative for water availability mapping over large areas; it offers a timely and cost-effective method that also provides the comprehensive spatio-temporal patterns crucial in water resource planning to ensure water security
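The core of the water-balance analysis described above is the per-cell difference between precipitation and actual evapotranspiration. A minimal sketch, with hypothetical monthly values standing in for the calibrated TRMM and MODIS grids:

```python
def water_availability(precip, aet):
    """Monthly water availability per grid cell as the simple balance
    P - AET (mm/month): precipitation from TRMM, actual
    evapotranspiration from MODIS.  Negative cells indicate a deficit.
    All values here are hypothetical."""
    return [[p - e for p, e in zip(prow, erow)]
            for prow, erow in zip(precip, aet)]

precip = [[210.0, 180.0], [150.0,  90.0]]   # mm/month (hypothetical)
aet    = [[120.0, 130.0], [110.0, 100.0]]
wa = water_availability(precip, aet)        # [[90.0, 50.0], [40.0, -10.0]]
```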

  7. Language Choice and Use of Malaysian Public University Lecturers in the Education Domain

    Directory of Open Access Journals (Sweden)

    Tam Lee Mei

    2016-02-01

    Full Text Available It is a norm for people from a multilingual and multicultural country such as Malaysia to speak at least two or more languages. Thus, the Malaysian multilingual situation resulted in speakers having to make decisions about which languages are to be used for different purposes in different domains. In order to explain the phenomenon of language choice, Fishman domain analysis (1964 was adapted into this research. According to Fishman’s domain analysis, language choice and use may depend on the speaker’s experiences situated in different settings, different language repertoires that are available to the speaker, different interlocutors and different topics. Such situations inevitably cause barriers and difficulties to those professionals who work in the education domain. Therefore, the purpose of this research is to explore the language choice and use of Malaysian public university lecturers in the education domain and to investigate whether any significant differences exist between ethnicity and field of study with the English language choice and use of the lecturers. 200 survey questionnaires were distributed to examine the details of the lecturers’ language choice and use. The findings of this research reveal that all of the respondents generally preferred to choose and use English language in both formal and informal education domain. Besides, all of the respondents claimed that they chose and used more than one language. It is also found that ethnicity and field of study of the respondents influence the language choice and use in the education domain. In addition, this research suggested that the language and educational policy makers have been largely successful in raising the role and status of the English language as the medium of instruction in tertiary education while maintaining the Malay language as having an important role in the communicative acts, thus characterizing the lecturers’ language choice and use. Keywords: Language

  8. Parallel computing of a climate model on the dawn 1000 by domain decomposition method

    Science.gov (United States)

    Bi, Xunqiang

    1997-12-01

    In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. The potential ways to increase the speed-up ratio and exploit more resources of future massively parallel supercomputing are also discussed.
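A two-dimensional domain decomposition of the kind described can be sketched as follows: the latitude-longitude grid is split into near-equal rectangular blocks, one per process. The grid and process-mesh sizes below are illustrative, not the IAP model's actual configuration.

```python
def block_ranges(n, parts):
    # Split n grid points into `parts` contiguous blocks whose sizes
    # differ by at most one (the usual load-balanced 1D decomposition).
    base, extra = divmod(n, parts)
    ranges, start = [], 0
    for p in range(parts):
        size = base + (1 if p < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

def decompose_2d(nx, ny, px, py):
    # Assign each of the px*py processes a rectangular sub-domain,
    # given as ((x_start, x_end), (y_start, y_end)) index ranges.
    xr, yr = block_ranges(nx, px), block_ranges(ny, py)
    return {(i, j): (xr[i], yr[j]) for i in range(px) for j in range(py)}

# Hypothetical 144 x 91 lat-lon grid on a 4 x 2 process mesh.
subdomains = decompose_2d(nx=144, ny=91, px=4, py=2)
```

In an actual MPI implementation each process would then allocate its block plus halo rows and exchange boundaries with its four neighbours each time step.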

  9. Domains of State-Owned, Privately Held, and Publicly Traded Firms in International Competition.

    Science.gov (United States)

    Mascarenhas, Briance

    1989-01-01

    Hypotheses relating ownership to domain differences among state-owned, publicly traded, and privately held firms in international competition were examined in a controlled field study of the offshore drilling industry. Ownership explained selected differences in domestic market dominance, international presence, and customer orientation, even…

  10. Combating Identity Fraud in the Public Domain: Information Strategies for Healthcare and Criminal Justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and public domain: increasing interorganisational co-operation and increasing digitisation. Nowadays, more and more processes within and between organisations take place electronically. These developments are visible on local, national and European scale.

  11. How You Can Protect Public Access Computers "and" Their Users

    Science.gov (United States)

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  12. Computer-Assisted Management of Instruction in Veterinary Public Health

    Science.gov (United States)

    Holt, Elsbeth; And Others

    1975-01-01

    Reviews a course in Food Hygiene and Public Health at the University of Illinois College of Veterinary Medicine in which students are sequenced through a series of computer-based lessons or autotutorial slide-tape lessons, the computer also being used to route, test, and keep records. Since grades indicated mastery of the subject, the course will…

  13. Public computing options for individuals with cognitive impairments: survey outcomes.

    Science.gov (United States)

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  14. Code and papers: computing publication patterns in the LHC era

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  15. Two-phase flow steam generator simulations on parallel computers using domain decomposition method

    International Nuclear Information System (INIS)

    Belliard, M.

    2003-01-01

    Within the framework of the Domain Decomposition Method (DDM), we present industrial steady state two-phase flow simulations of PWR Steam Generators (SG) using iteration-by-sub-domain methods: standard and Adaptive Dirichlet/Neumann methods (ADN). The averaged mixture balance equations are solved by a Fractional-Step algorithm, jointly with the Crank-Nicholson scheme and the Finite Element Method. The algorithm works with overlapping or non-overlapping sub-domains and with conforming or nonconforming meshing. Computations are run on PC networks or on massively parallel mainframe computers. A CEA code-linker and the PVM package are used (master-slave context). SG mock-up simulations, involving up to 32 sub-domains, highlight the efficiency (speed-up, scalability) and the robustness of the chosen approach. With the DDM, the computational problem size is easily increased to about 1,000,000 cells and the CPU time is significantly reduced. The difficulties related to industrial use are also discussed. (author)
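The iteration-by-sub-domain idea can be illustrated on a far smaller problem than a steam generator. The sketch below applies an alternating Schwarz iteration with two overlapping sub-domains to the 1D Poisson problem -u'' = 1 with homogeneous Dirichlet conditions, each sub-domain solved directly with a tridiagonal (Thomas) solve. It shows the structure of a DDM iteration only, not the ADN method or the two-phase flow solver; all names and sizes are illustrative.

```python
def thomas(a, b, c, d):
    # Solve the tridiagonal system a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i].
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def solve_subdomain(u, lo, hi, h, f):
    # Direct solve of -u'' = f on grid points lo..hi with Dirichlet data
    # u[lo], u[hi] taken from the current global iterate.
    n = hi - lo - 1                      # interior unknowns
    a, b, c = [-1.0] * n, [2.0] * n, [-1.0] * n
    d = [h * h * f] * n
    d[0] += u[lo]
    d[-1] += u[hi]
    u[lo + 1:hi] = thomas(a, b, c, d)

N = 40
h = 1.0 / N
u = [0.0] * (N + 1)                      # initial guess; u(0) = u(1) = 0 fixed
for _ in range(60):                      # alternating Schwarz sweeps
    solve_subdomain(u, 0, 24, h, 1.0)    # left sub-domain,  x in [0, 0.6]
    solve_subdomain(u, 16, N, h, 1.0)    # right sub-domain, x in [0.4, 1]

exact = [x * (1.0 - x) / 2.0 for x in (i * h for i in range(N + 1))]
err = max(abs(ui - ei) for ui, ei in zip(u, exact))
```

Because the exact solution is quadratic, the second-order finite-difference solution coincides with it at the nodes, so the iteration error alone is visible; the overlap makes it contract geometrically, mirroring the sub-domain convergence behaviour studied for the SG mock-ups.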

  16. The Value of Privacy and Surveillance Drones in the Public Domain : Scrutinizing the Dutch Flexible Deployment of Mobile Cameras Act

    NARCIS (Netherlands)

    Gerdo Kuiper; Quirine Eijkman

    2017-01-01

    The flexible deployment of drones in the public domain is assessed in this article from a legal philosophical perspective. On the basis of the theories of Dworkin and Moore, the distinction between individual rights and collective security policy goals is discussed. Mobile cameras in the public domain

  17. The complexity of changes in the domain of managing public expenditures

    Directory of Open Access Journals (Sweden)

    Dimitrijević Marina

    2016-01-01

    Full Text Available Public expenditures are a huge problem in contemporary states. In the conditions of a global economic crisis and circumstances involving a high level of citizen dissatisfaction with the former methods of funding and managing the public sector (reflected in the ruining of funding sources, irrational spending of public expenditure funds, and increases in the budget deficit and the level of public debt), changes in the domain of managing public expenditures have become a priority. By their nature, these changes are complex and long-lasting, and they should bring significant improvements in the field of public expenditure; they have to provide for the lawful and purposeful spending of public funds. They are expected to lower the public revenues needed for financing public expenditure, to improve production and competition in the market economy, and to increase personal consumption, the living standard and the quality of life of the population. Regardless of the social, economic, legal or political environment of each state, the topical issue of reforming the management of public expenditures seems to imply a return to the somewhat neglected need for the public sector to function within its own financial means. The state modernisation processes and advancement in managing public expenditures call for a realistic evaluation of the existing conditions and circumstances in which these processes occur, as well as an assessment of the potential and actual risks that may hinder their effectiveness. Otherwise, the establishment of a significant level of responsibility in spending budget funds and greater transparency of public expenditure may remain far-fetched goals.

  18. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Due to the popularity of soft computing approaches, they have also been applied to healthcare data for effectively diagnosing diseases and obtaining better results in comparison to traditional approaches. Soft computing approaches have the ability to adapt themselves according to the problem domain. Another aspect is a good balance between the exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable and efficient, and suitable and competent for healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, it is found that a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data. Some of these are particle swarm optimization, genetic algorithms, artificial neural networks, support vector machines, etc. A detailed discussion of these approaches is presented in the literature section. This work summarizes the various soft computing approaches used in the healthcare domain in the last decade. These approaches are categorized into five different categories based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed in the above-mentioned categories, and all discussed techniques are also summarized in the form of tables. This work also focuses on the accuracy rates of soft computing techniques, and tabular information is provided for

  19. Blockchain-based Public Key Infrastructure for Inter-Domain Secure Routing

    OpenAIRE

    de la Rocha Gómez-Arevalillo , Alfonso; Papadimitratos , Panos

    2017-01-01

    International audience; A gamut of secure inter-domain routing protocols has been proposed in the literature. They use traditional PGP-like and centralized Public Key Infrastructures for trust management. In this paper, we propose an alternative approach for managing security associations, Secure Blockchain Trust Management (SBTM), a trust management system that instantiates a blockchain-based PKI for the operation of secure routing protocols. A main motivation for SBTM is to facilitate gradu...

  20. Suburban development – a search for public domains in Danish suburban neighbourhoods

    DEFF Research Database (Denmark)

    Melgaard, Bente; Bech-Danielsen, Claus

    These years, some of the post-war Danish suburbs are facing great challenges: social segregation, demographic changes and challenges in building technology. In particular, segregation prevents social life from unfolding across social, economic and cultural borders. Therefore, in this paper, potentials for bridge-building across the enclaves of the suburb are looked for through a combined architectural-anthropological mapping of public spaces in a specific suburb in Denmark, the analyses being carried out in the light of Hajer & Reijndorp’s definition of public domains and the term exchange...

  1. Towards development of a high quality public domain global roads database

    Directory of Open Access Journals (Sweden)

    Andrew Nelson

    2006-12-01

    Full Text Available There is clear demand for a global spatial public domain roads data set with improved geographic and temporal coverage, consistent coding of road types, and clear documentation of sources. The currently best available global public domain product covers only one-quarter to one-third of the existing road networks, and this varies considerably by region. Applications for such a data set span multiple sectors and would be particularly valuable for the international economic development, disaster relief, and biodiversity conservation communities, not to mention national and regional agencies and organizations around the world. The building blocks for such a global product are available for many countries and regions, yet thus far there has been neither strategy nor leadership for developing it. This paper evaluates the best available public domain and commercial data sets, assesses the gaps in global coverage, and proposes a number of strategies for filling them. It also identifies stakeholder organizations with an interest in such a data set that might either provide leadership or funding for its development. It closes with a proposed set of actions to begin the process.

  2. Public Services 2.0: The Impact of Social Computing on Public Services

    NARCIS (Netherlands)

    Huijboom, Noor; Broek, Tijs Van Den; Frissen, Valerie; Kool, Linda; Kotterink, Bas; Nielsen, Morten Meyerhoff; Millard, Jeremy

    2009-01-01

    The report gives an overview of the main trends of Social Computing, in the wider context of an evolving public sector, and in relation to relevant government trends and normative policy visions within and across EU Member States on future public services. It then provides an exhaustive literature

  3. On Stability of Exact Transparent Boundary Condition for the Parabolic Equation in Rectangular Computational Domain

    Science.gov (United States)

    Feshchenko, R. M.

    Recently a new exact transparent boundary condition (TBC) for the 3D parabolic wave equation (PWE) in a rectangular computational domain was derived. However, in the obtained form it does not appear to be unconditionally stable when used with, for instance, the Crank-Nicolson finite-difference scheme. In this paper two new formulations of the TBC for the 3D PWE in a rectangular computational domain are reported, which are likely to be unconditionally stable. They are based on an unconditionally stable, fully discrete TBC for the Crank-Nicolson scheme for the 2D PWE. These new forms of the TBC can be used for the numerical solution of the 3D PWE when higher precision is required.

  4. Gamma-Weighted Discrete Ordinate Two-Stream Approximation for Computation of Domain Averaged Solar Irradiance

    Science.gov (United States)

    Kato, S.; Smith, G. L.; Barker, H. W.

    2001-01-01

    An algorithm is developed for the gamma-weighted discrete ordinate two-stream approximation that computes profiles of domain-averaged shortwave irradiances for horizontally inhomogeneous cloudy atmospheres. The algorithm assumes that frequency distributions of cloud optical depth at unresolved scales can be represented by a gamma distribution though it neglects net horizontal transport of radiation. This algorithm is an alternative to the one used in earlier studies that adopted the adding method. At present, only overcast cloudy layers are permitted.
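The gamma-weighting idea can be illustrated for the simplest radiative quantity, the direct-beam transmittance: the plane-parallel result exp(-τ) is averaged over a gamma distribution of cloud optical depth, for which a closed form exists. This sketch is only the weighting step, not the full two-stream algorithm, and the shape/scale parameters below are illustrative.

```python
import math

def gamma_pdf(tau, k, theta):
    # Gamma distribution of cloud optical depth (shape k, scale theta).
    return tau**(k - 1) * math.exp(-tau / theta) / (math.gamma(k) * theta**k)

def mean_transmittance(k, theta, n=100000, tau_max=200.0):
    """Domain-averaged direct-beam transmittance E[exp(-tau)], obtained
    by weighting the plane-parallel result exp(-tau) with the gamma
    distribution of optical depth (trapezoidal quadrature)."""
    h = tau_max / n
    total = 0.0
    for i in range(n + 1):
        tau = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * gamma_pdf(tau, k, theta) * math.exp(-tau)
    return total * h

# For a gamma distribution the average has the closed form (1 + theta)**(-k),
# a useful check on the quadrature.
k, theta = 2.0, 5.0                 # mean optical depth = k*theta = 10
t_numeric = mean_transmittance(k, theta)
t_exact = (1.0 + theta) ** (-k)
```

Note how the domain average (about 0.028 here) far exceeds the plane-parallel value exp(-10) ≈ 4.5e-5 at the mean optical depth: this bias is precisely why unresolved horizontal inhomogeneity must be weighted rather than averaged away.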

  5. Research Progress of Global Land Domain Service Computing:Take GlobeLand 30 as an Example

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2017-10-01

    Full Text Available Combining service computing technology with domain requirements is one of the important development directions of geographic information under Internet+, as it provides highly efficient technical means for information sharing and collaborative services. Using GlobeLand 30 as an example, this paper analyzes the basic problems of integrating land cover information processing with service computing, introduces the latest research progress in domain service modeling, online computing methods and dynamic service technology, and describes the GlobeLand 30 information service platform. The paper also discusses further development directions of GlobeLand 30 domain service computing.

  6. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze the outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of the simulation games on students’ knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In the control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In the experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games in addition to traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students’ outcomes by 38%. In general, the experimental groups performed better on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  7. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    Science.gov (United States)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to cores present on a dual quad-core shared-memory system (eight total processors). We note that this message passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared memory platform indicate nearly-ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel, and yields excellent speed-up on SMP hardware.
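
    The distribution step described above (assigning a domain's automata to processor ranks) can be sketched as a simple block partition. This is a hypothetical Python illustration, not the authors' Java/MPJ Express code; the function name and the contiguous 1D block layout are assumptions:

    ```python
    # Hypothetical sketch of the partitioning that precedes an MPI-style
    # distribution of cellular automata to processor ranks.

    def partition_cells(num_cells: int, num_ranks: int) -> list[range]:
        """Split cell indices 0..num_cells-1 into contiguous, near-equal blocks."""
        base, extra = divmod(num_cells, num_ranks)
        blocks, start = [], 0
        for rank in range(num_ranks):
            size = base + (1 if rank < extra else 0)  # spread the remainder
            blocks.append(range(start, start + size))
            start += size
        return blocks

    if __name__ == "__main__":
        # Eight ranks, as on the dual quad-core system described above.
        for rank, block in enumerate(partition_cells(1000, 8)):
            print(rank, len(block))
    ```

    Each rank would then step only its own block of automata, exchanging boundary states with neighbouring ranks via message passing.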

  8. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  9. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the largest losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting, engineering design, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
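
    The domain tracking idea (computing only on inundated cells) can be sketched as follows. This is a hypothetical illustration, not the paper's code; the function name and 4-neighbour stencil are assumptions:

    ```python
    # Minimal sketch of "domain tracking": restrict per-timestep work to
    # currently wet cells and their neighbours instead of the whole grid.

    def active_cells(wet: set, nx: int, ny: int) -> set:
        """Cells that must be updated: wet cells plus their 4-neighbours."""
        active = set()
        for (i, j) in wet:
            for di, dj in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < nx and 0 <= nj < ny:
                    active.add((ni, nj))
        return active

    if __name__ == "__main__":
        wet = {(5, 5)}  # a single wet cell on a 100 x 100 grid
        print(len(active_cells(wet, 100, 100)))  # 5 cells instead of 10000
    ```

    Early in a flood event the wet region is tiny relative to the grid, which is where the reported savings come from.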

  10. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    International Nuclear Information System (INIS)

    Desai, Ajit; Pettit, Chris; Poirel, Dominique; Sarkar, Abhijit

    2017-01-01

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems, or linearized systems for non-linear problems, with billions of unknowns. For stochastic modeling, it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems; although these algorithms exhibit excellent scalability, significant algorithmic and implementation challenges remain in extending them to solve extreme-scale stochastic systems on emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolution in both the spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain level local systems through multi-level iterative solvers. We also use parallel sparse matrix-vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation having a spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.
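
    To make the domain decomposition idea concrete, here is a toy overlapping (Schwarz-type) iteration for the deterministic model problem u'' = 0 on [0,1] with u(0)=0, u(1)=1, where each subdomain solve is an exact linear interpolation between its boundary values. This is a deliberately simplified sketch under those assumptions, far from the stochastic FEM solvers described in the abstract:

    ```python
    # Toy overlapping Schwarz iteration: two subdomains of [0,1] exchange
    # interface values until the global linear solution u(x) = x is recovered.

    def schwarz(n=11, overlap=3, iters=50):
        u = [0.0] * n
        u[-1] = 1.0                       # boundary conditions u(0)=0, u(1)=1
        mid = n // 2
        left_end, right_start = mid + overlap // 2, mid - overlap // 2
        for _ in range(iters):
            # Left subdomain [0, left_end]: exact solve with u[left_end] fixed.
            for i in range(1, left_end):
                u[i] = u[left_end] * i / left_end
            # Right subdomain [right_start, n-1]: exact solve with u[right_start] fixed.
            span = (n - 1) - right_start
            for i in range(right_start + 1, n - 1):
                u[i] = u[right_start] + (1.0 - u[right_start]) * (i - right_start) / span
        return u

    if __name__ == "__main__":
        u = schwarz()
        print(max(abs(u[i] - i / 10) for i in range(11)))  # error shrinks toward 0
    ```

    The interface values contract toward the exact solution at a fixed rate per sweep, which is the convergence behaviour that preconditioners and coarse corrections accelerate in production solvers.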

  11. The International River Interface Cooperative: Public Domain Software for River Flow and Morphodynamics (Invited)

    Science.gov (United States)

    Nelson, J. M.; Shimizu, Y.; McDonald, R.; Takebayashi, H.

    2009-12-01

    The International River Interface Cooperative is an informal organization made up of academic faculty and government scientists with the goal of developing, distributing and providing education for a public-domain software interface for modeling river flow and morphodynamics. Formed in late 2007, the group released the first version of this interface (iRIC) in late 2009. iRIC includes models for two and three-dimensional flow, sediment transport, bed evolution, groundwater-surface water interaction, topographic data processing, and habitat assessment, as well as comprehensive data and model output visualization, mapping, and editing tools. All the tools in iRIC are specifically designed for use in river reaches and utilize common river data sets. The models are couched within a single graphical user interface so that a broad spectrum of models are available to users without learning new pre- and post-processing tools. The first version of iRIC was developed by combining the USGS public-domain Multi-Dimensional Surface Water Modeling System (MD_SWMS), developed at the USGS Geomorphology and Sediment Transport Laboratory in Golden, Colorado, with the public-domain river modeling code NAYS developed by the Universities of Hokkaido and Kyoto, Mizuho Corporation, and the Foundation of the River Disaster Prevention Research Institute in Sapporo, Japan. Since this initial effort, other Universities and Agencies have joined the group, and the interface has been expanded to allow users to integrate their own modeling code using Extensible Markup Language (XML), which provides easy access and expandability to the iRIC software interface. In this presentation, the current components of iRIC are described and results from several practical modeling applications are presented to illustrate the capabilities and flexibility of the software. In addition, some future extensions to iRIC are demonstrated, including software for Lagrangian particle tracking and the prediction of

  12. Towards an information strategy for combating identity fraud in the public domain: Cases from healthcare and criminal justice

    NARCIS (Netherlands)

    Plomp, M.G.A.; Grijpink, J.H.A.M.

    2011-01-01

    Two trends are present in both the private and public domain: increasing interorganisational co-operation and increasing digitisation. More and more processes within and between organisations take place electronically, on local, national and European scale. The technological and organisational

  13. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by using a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) to nodes found located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  14. Concrete domains

    OpenAIRE

    Kahn, G.; Plotkin, G.D.

    1993-01-01

    This paper introduces the theory of a particular kind of computation domains called concrete domains. The purpose of this theory is to find a satisfactory framework for the notions of coroutine computation and sequentiality of evaluation.

  15. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are highly mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the common processing and numerical calculations. Then, to improve commonality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The results prove that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability and high efficiency.
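
    The reason the 2D FFT/IFFT is the natural optimization target is its separability: a 2D transform is 1D transforms over the rows followed by 1D transforms over the columns, so hardware only needs a fast 1D kernel. The sketch below illustrates the row-column decomposition with plain O(N^2) DFTs for clarity (real hardware would use FFTs); it is illustrative only and unrelated to the paper's architecture:

    ```python
    # Row-column decomposition of the 2D DFT: 1D DFTs over rows, then columns.
    import cmath

    def dft(v):
        n = len(v)
        return [sum(v[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
                for j in range(n)]

    def dft2(img):
        rows = [dft(row) for row in img]                     # 1D pass over rows
        cols = [dft([rows[i][j] for i in range(len(img))])   # 1D pass over columns
                for j in range(len(img[0]))]
        return [[cols[j][i] for j in range(len(img[0]))] for i in range(len(img))]

    if __name__ == "__main__":
        out = dft2([[1.0, 0.0], [0.0, 0.0]])  # unit impulse
        print(out)  # every coefficient equals 1 for a unit impulse
    ```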

  16. Computational design of selective peptides to discriminate between similar PDZ domains in an oncogenic pathway.

    Science.gov (United States)

    Zheng, Fan; Jewell, Heather; Fitzpatrick, Jeremy; Zhang, Jian; Mierke, Dale F; Grigoryan, Gevorg

    2015-01-30

    Reagents that target protein-protein interactions to rewire signaling are of great relevance in biological research. Computational protein design may offer a means of creating such reagents on demand, but methods for encoding targeting selectivity are sorely needed. This is especially challenging when targeting interactions with ubiquitous recognition modules--for example, PDZ domains, which bind C-terminal sequences of partner proteins. Here we consider the problem of designing selective PDZ inhibitor peptides in the context of an oncogenic signaling pathway, in which two PDZ domains (NHERF-2 PDZ2-N2P2 and MAGI-3 PDZ6-M3P6) compete for a receptor C-terminus to differentially modulate oncogenic activities. Because N2P2 has been shown to increase tumorigenicity and M3P6 to decrease it, we sought to design peptides that inhibit N2P2 without affecting M3P6. We developed a structure-based computational design framework that models peptide flexibility in binding yet is efficient enough to rapidly analyze tradeoffs between affinity and selectivity. Designed peptides showed low-micromolar inhibition constants for N2P2 and no detectable M3P6 binding. Peptides designed for reverse discrimination bound M3P6 tighter than N2P2, further testing our technology. Experimental and computational analysis of selectivity determinants revealed significant indirect energetic coupling in the binding site. Successful discrimination between N2P2 and M3P6, despite their overlapping binding preferences, is highly encouraging for computational approaches to selective PDZ targeting, especially because design relied on a homology model of M3P6. Still, we demonstrate specific deficiencies of structural modeling that must be addressed to enable truly robust design. The presented framework is general and can be applied in many scenarios to engineer selective targeting. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    Science.gov (United States)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent and mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of full shallow water equation models have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell is flooded, register its surrounding cells. The cost of this additional process is kept low by checking only cells at the wet and dry interface. The computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood inundation model. The proposed ADU method is implemented in the 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes two to ten times faster while producing the same results as the simulation without the ADU method.
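
    The two-step updating algorithm described above can be sketched directly: keep a set of registered cells and, whenever a registered cell becomes wet, register its neighbours. This is a hypothetical Python sketch of the rule, not the authors' implementation; names and the 4-neighbour stencil are assumptions:

    ```python
    # Sketch of the Automatic Domain Updating (ADU) rule: the active
    # simulation domain grows with the flood front.

    def step_domain(registered: set, wet: set, nx: int, ny: int) -> set:
        """Register the neighbours of every currently wet cell."""
        for (i, j) in list(wet):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < nx and 0 <= nj < ny:
                    registered.add((ni, nj))
        return registered

    if __name__ == "__main__":
        registered = {(0, 0)}   # e.g. floodplain cells near the river channel
        wet = {(0, 0)}          # flooding starts here
        step_domain(registered, wet, 50, 50)
        print(sorted(registered))  # domain now also covers (0, 1) and (1, 0)
    ```

    Per timestep, the solver then loops only over `registered` cells, which is how the processing of non-flooded areas is skipped.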

  18. Logic and memory concepts for all-magnetic computing based on transverse domain walls

    International Nuclear Information System (INIS)

    Vandermeulen, J; Van de Wiele, B; Dupré, L; Van Waeyenberge, B

    2015-01-01

    We introduce a non-volatile digital logic and memory concept in which the binary data is stored in the transverse magnetic domain walls present in in-plane magnetized nanowires with sufficiently small cross-sectional dimensions. We assign the digital bit to the two possible orientations of the transverse domain wall. Numerical proofs-of-concept are presented for NOT, AND and OR gates, a fan-out, as well as a reading and writing device. Contrary to the chirality-based vortex domain wall logic gates introduced in Omari and Hayward (2014 Phys. Rev. Appl. 2 044001), the presented concepts remain applicable when miniaturized and are driven by electrical currents, making the technology compatible with the in-plane racetrack memory concept. The individual devices can be easily combined into logic networks working with clock speeds that scale linearly with decreasing design dimensions. This opens opportunities for an all-magnetic computing technology where the digital data is stored and processed under the same magnetic representation. (paper)
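
    The bit encoding can be made concrete with a toy truth-table model: a bit is the orientation of a transverse domain wall ("up" = 1, "down" = 0), and each gate maps wall orientations to a wall orientation. This is purely illustrative of the logical abstraction; the actual devices operate magnetically, not in software:

    ```python
    # Toy model of domain-wall logic: bits are wall orientations.
    UP, DOWN = "up", "down"

    def NOT(a):        # reverses the wall orientation
        return DOWN if a == UP else UP

    def AND(a, b):     # output wall is "up" only if both inputs are "up"
        return UP if (a, b) == (UP, UP) else DOWN

    def OR(a, b):
        return UP if UP in (a, b) else DOWN

    def fan_out(a):    # duplicates a wall so one bit can drive two gates
        return a, a

    if __name__ == "__main__":
        x, y = fan_out(UP)
        print(AND(NOT(x), OR(y, DOWN)))  # prints "down"
    ```

    Because storage and processing share the same representation (wall orientation), a gate's output can feed a memory element directly, which is the point of the all-magnetic concept.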

  19. Exposure to simultaneous sedentary behavior domains and sociodemographic factors associated in public servants

    Directory of Open Access Journals (Sweden)

    Fernanda Cerveira Fronza

    2017-11-01

    Full Text Available DOI: http://dx.doi.org/10.5007/1980-0037.2017v19n4p469   Exposure to sedentary behavior may contribute to health problems. This study aimed to estimate the prevalence of exposure to simultaneous sedentary behavior domains and to verify associated sociodemographic characteristics among technical and administrative servants of a Brazilian university. This is a cross-sectional epidemiological study carried out with 623 technical and administrative servants. Sedentary behavior was identified through a questionnaire in the following domains: commuting (active/passive), sitting time at work, daily time spent watching television, and computer use (≥3 hours/day). Sociodemographic variables were age, sex and educational level. The prevalence of servants that had one, two, three and four simultaneous sedentary behaviors was 28.4%, 43.2%, 22.5% and 4.3%, respectively. Women were more likely to have three sedentary behaviors simultaneously (OR = 1.61, 95% CI = 1.02-2.56). Servants with 9-11 years of schooling were less exposed to two (OR = 0.27, 95% CI = 0.17-0.44), three (OR = 0.39, 95% CI = 0.23-0.66) and four (OR = 0.22, 95% CI = 0.07-0.69) sedentary behaviors simultaneously, and those with over 12 years of schooling were less likely to have two (OR = 0.22, 95% CI = 0.10-0.49) and three (OR = 0.15, 95% CI = 0.05-0.46) sedentary behaviors simultaneously. More than half of the servants had two sedentary behaviors during the week. Having sedentary behavior in more than one domain simultaneously was associated with sex and educational level.

  20. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  1. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    International Nuclear Information System (INIS)

    Meena, T.R.; Anoj Kumar; Patra, R.P.; Vikas; Patil, S.S.; Chatterjee, M.K.; Sharma, Ranjit; Murali, S.

    2014-01-01

    A national-level response mechanism is developed at the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, together with the National Disaster Response Force under the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifinders and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought in some other form into the public domain, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated their capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi

  2. Radiological emergencies due to postulated events of melted radioactive material mixed in steel reaching public domain

    Energy Technology Data Exchange (ETDEWEB)

    Meena, T. R.; Kumar, Anoj; Patra, R. P.; Vikas,; Patil, S. S.; Chatterjee, M. K.; Sharma, Ranjit; Murali, S., E-mail: tejram@barc.gov.in [Radiation Safety Systems Division, Bhabha Atomic Research Centre, Mumbai (India)

    2014-07-01

    A national-level response mechanism is developed at the emergency response centres of DAE (DAE-ERCs) at 22 different locations spread all over the country, together with the National Disaster Response Force under the National Disaster Management Authority (NDMA). ERCs are equipped with radiation monitors, radionuclide identifinders and Personnel Radiation Dosimeters (PRD), with monitoring capabilities of the order of tens of nGy/h (μR/hr) above the radiation background at any suspected location. Even if small amounts of radioactive material are smuggled and brought in some other form into the public domain, ERCs are capable of detecting, identifying and segregating the radioactive material from any inactive scrap. DAE-ERCs have demonstrated their capability in source search, detection, identification and recovery during the radiological emergency at Mayapuri, New Delhi.

  3. New Inversion and Interpretation of Public-Domain Electromagnetic Survey Data from Selected Areas in Alaska

    Science.gov (United States)

    Smith, B. D.; Kass, A.; Saltus, R. W.; Minsley, B. J.; Deszcz-Pan, M.; Bloss, B. R.; Burns, L. E.

    2013-12-01

    Public-domain airborne geophysical surveys (combined electromagnetics and magnetics), mostly collected for and released by the State of Alaska, Division of Geological and Geophysical Surveys (DGGS), are a unique and valuable resource for both geologic interpretation and geophysical methods development. A new joint effort by the US Geological Survey (USGS) and the DGGS aims to add value to these data through the application of novel advanced inversion methods and through innovative and intuitive display of data: maps, profiles, voxel-based models, and displays of estimated inversion quality and confidence. Our goal is to make these data even more valuable for interpretation of geologic frameworks, geotechnical studies, and cryosphere studies, by producing robust estimates of subsurface resistivity that can be used by non-geophysicists. The available datasets, which are available in the public domain, include 39 frequency-domain electromagnetic datasets collected since 1993, and continue to grow with 5 more data releases pending in 2013. The majority of these datasets were flown for mineral resource purposes, with one survey designed for infrastructure analysis. In addition, several USGS datasets are included in this study. The USGS has recently developed new inversion methodologies for airborne EM data and have begun to apply these and other new techniques to the available datasets. These include a trans-dimensional Markov Chain Monte Carlo technique, laterally-constrained regularized inversions, and deterministic inversions which include calibration factors as a free parameter. Incorporation of the magnetic data as an additional constraining dataset has also improved the inversion results. Processing has been completed in several areas, including Fortymile and the Alaska Highway surveys, and continues in others such as the Styx River and Nome surveys. Utilizing these new techniques, we provide models beyond the apparent resistivity maps supplied by the original

  4. Domain interaction in rabbit muscle pyruvate kinase. II. Small angle neutron scattering and computer simulation.

    Science.gov (United States)

    Consler, T G; Uberbacher, E C; Bunick, G J; Liebman, M N; Lee, J C

    1988-02-25

    The effects of ligands on the structure of rabbit muscle pyruvate kinase were studied by small angle neutron scattering. The radius of gyration, RG, decreases by about 1 A in the presence of the substrate phosphoenolpyruvate, but increases by about the same magnitude in the presence of the allosteric inhibitor phenylalanine. With increasing pH or in the absence of Mg2+ and K+, the RG of pyruvate kinase increases. Hence, there is a 2-A difference in RG between two alternative conformations. Length distribution analysis indicates that, under all experimental conditions which increase the radius of gyration, there is a pronounced increase observed in the probability for interatomic distance between 80 and 110 A. These small angle neutron scattering results indicate a "contraction" and "expansion" of the enzyme when it transforms between its active and inactive forms. Using the alpha-carbon coordinates of crystalline cat muscle pyruvate kinase, a length distribution profile was calculated, and it matches the scattering profile of the inactive form. These observations are expected since the crystals were grown in the absence of divalent cations (Stuart, D. I., Levine, M., Muirhead, H., and Stammers, D. K. (1979) J. Mol. Biol. 134, 109-142). Hence, results from neutron scattering, x-ray crystallographic, and sedimentation studies (Oberfelder, R. W., Lee, L. L.-Y., and Lee, J.C. (1984) Biochemistry 23, 3813-3821) are totally consistent with each other. With the aid of computer modeling, the crystal structure has been manipulated in order to effect changes that are consistent with the conformational change described by the solution scattering data. The structural manipulation involves the rotation of the B domain relative to the A domain, leading to the closure of the cleft between these domains. These manipulations resulted in the generation of new sets of atomic (C-alpha) coordinates, which were utilized in calculations, the result of which compared favorably with the
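
    The quantity at the centre of this abstract, the radius of gyration R_G, is the root-mean-square distance of the scattering centres (here, C-alpha positions) from their centroid. A minimal unweighted sketch of that calculation follows; the coordinates in the example are made up for illustration, and a real analysis would weight by scattering length or mass:

    ```python
    # R_G = sqrt( mean of squared distances from the centroid ), equal weights.
    import math

    def radius_of_gyration(coords):
        n = len(coords)
        cx = sum(p[0] for p in coords) / n
        cy = sum(p[1] for p in coords) / n
        cz = sum(p[2] for p in coords) / n
        msd = sum((p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 for p in coords) / n
        return math.sqrt(msd)

    if __name__ == "__main__":
        # Two points 2 A apart: each lies 1 A from the centroid, so R_G = 1.
        print(radius_of_gyration([(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]))
    ```

    Computing R_G from the crystal coordinates before and after rotating the B domain is how a modeled "cleft closure" can be compared against the ~2 A change seen in the scattering data.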

  5. EMGAN: A computer program for time and frequency domain reduction of electromyographic data

    Science.gov (United States)

    Hursta, W. N.

    1975-01-01

    An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.

  6. Computed tear film and osmolarity dynamics on an eye-shaped domain

    Science.gov (United States)

    Li, Longfei; Braun, Richard J.; Driscoll, Tobin A.; Henshaw, William D.; Banks, Jeffrey W.; King-Smith, P. Ewen

    2016-01-01

    The concentration of ions, or osmolarity, in the tear film is a key variable in understanding dry eye symptoms and disease. In this manuscript, we derive a mathematical model that couples osmolarity (treated as a single solute) and fluid dynamics within the tear film on a 2D eye-shaped domain. The model includes the physical effects of evaporation, surface tension, viscosity, ocular surface wettability, osmolarity, osmosis and tear fluid supply and drainage. The governing system of coupled non-linear partial differential equations is solved using the Overture computational framework, together with a hybrid time-stepping scheme, using a variable step backward differentiation formula and a Runge–Kutta–Chebyshev method that were added to the framework. The results of our numerical simulations provide new insight into the osmolarity distribution over the ocular surface during the interblink. PMID:25883248

  7. Domain decomposition parallel computing for transient two-phase flow of nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of); Choi, Hyoung Gwon [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    KAERI (Korea Atomic Energy Research Institute) has been developing a multi-dimensional two-phase flow code named CUPID for multi-physics and multi-scale thermal hydraulics analysis of Light water reactors (LWRs). The CUPID code has been validated against a set of conceptual problems and experimental data. In this work, the CUPID code has been parallelized based on the domain decomposition method with the Message passing interface (MPI) library. For domain decomposition, the CUPID code provides both manual and automatic methods with the METIS library. For effective memory management, the Compressed sparse row (CSR) format is adopted, which is one of the methods to represent a sparse asymmetric matrix. The CSR format stores only the non-zero values and their positions (row and column). By performing verification on the fundamental problem set, the parallelization of CUPID has been successfully confirmed. Since the scalability of a parallel simulation is generally known to be better for fine mesh systems, three different scales of mesh system are considered: 40000 meshes for the coarse mesh system, 320000 meshes for the mid-size mesh system, and 2560000 meshes for the fine mesh system. In the given geometry, both single- and two-phase calculations were conducted. In addition, two types of preconditioners for the matrix solver were compared: diagonal and incomplete LU preconditioners. To enhance the parallel performance, hybrid OpenMP and MPI parallel computing for the pressure solver was examined. It is revealed that the scalability of the hybrid calculation was enhanced for multi-core parallel computation.
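
    The CSR layout mentioned above stores three arrays: the non-zero values, their column indices, and per-row extents into those arrays. A minimal sparse matrix-vector product over that layout can be sketched as follows; this is a generic illustration of CSR, independent of the CUPID code:

    ```python
    # y = A @ x for A stored in CSR form (data, indices, indptr).

    def csr_matvec(data, indices, indptr, x):
        y = []
        for row in range(len(indptr) - 1):
            s = 0.0
            # indptr[row]..indptr[row+1] delimit this row's non-zeros.
            for k in range(indptr[row], indptr[row + 1]):
                s += data[k] * x[indices[k]]
            y.append(s)
        return y

    if __name__ == "__main__":
        # A = [[4, 0, 1],
        #      [0, 2, 0],
        #      [0, 0, 3]] in CSR form:
        data, indices, indptr = [4.0, 1.0, 2.0, 3.0], [0, 2, 1, 2], [0, 2, 3, 4]
        print(csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0]))  # [5.0, 2.0, 3.0]
    ```

    Only the non-zeros are touched, which is why CSR reduces both memory and floating-point work for sparse systems like those arising from domain-decomposed meshes.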

  8. Computational design of a PDZ domain peptide inhibitor that rescues CFTR activity.

    Directory of Open Access Journals (Sweden)

    Kyle E Roberts

    Full Text Available The cystic fibrosis transmembrane conductance regulator (CFTR) is an epithelial chloride channel mutated in patients with cystic fibrosis (CF). The most prevalent CFTR mutation, ΔF508, blocks folding in the endoplasmic reticulum. Recent work has shown that some ΔF508-CFTR channel activity can be recovered by pharmaceutical modulators ("potentiators" and "correctors"), but ΔF508-CFTR can still be rapidly degraded via a lysosomal pathway involving the CFTR-associated ligand (CAL), which binds CFTR via a PDZ interaction domain. We present a study that goes from theory, to new structure-based computational design algorithms, to computational predictions, to biochemical testing and ultimately to epithelial-cell validation of novel, effective CAL PDZ inhibitors (called "stabilizers") that rescue ΔF508-CFTR activity. To design the "stabilizers", we extended our structural ensemble-based computational protein redesign algorithm K* to encompass protein-protein and protein-peptide interactions. The computational predictions achieved high accuracy: all of the top-predicted peptide inhibitors bound well to CAL. Furthermore, when compared to state-of-the-art CAL inhibitors, our design methodology achieved higher affinity and increased binding efficiency. The designed inhibitor with the highest affinity for CAL (kCAL01) binds six-fold more tightly than the previous best hexamer (iCAL35), and 170-fold more tightly than the CFTR C-terminus. We show that kCAL01 has physiological activity and can rescue chloride efflux in CF patient-derived airway epithelial cells. Since stabilizers address a different cellular CF defect from potentiators and correctors, our inhibitors provide an additional therapeutic pathway that can be used in conjunction with current methods.

  9. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as the result of Boolean operations on so-called ‘simple’ shapes. Through generalisation, the class of ‘simple’ shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond these limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as the equilibrium diagrams of metal alloys, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The software to be subsequently developed will be modularised and generalised for use in upcoming projects that require rapid development of computer based models.

  10. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons, as well as lessons learned, from utilizing virtualization technology in the climate and earth systems modeling domain.

  11. Computational domain discretization in numerical analysis of flow within granular materials

    Science.gov (United States)

    Sosnowski, Marcin

    2018-06-01

    The discretization of the computational domain is a crucial step in Computational Fluid Dynamics (CFD) because it influences not only the numerical stability of the analysed model but also the agreement between the obtained results and real data. Modelling flow in packed beds of granular materials is a very challenging task in terms of discretization due to the narrow spaces between spherical granules contacting tangentially at a single point. The standard approach to this issue results in a low quality mesh and, in consequence, unreliable results. A common workaround is therefore to reduce the diameter of the modelled granules in order to eliminate the single-point contact between individual granules. The drawback of this method is, among others, the distortion of the flow and of the contact heat resistance. An innovative method is therefore proposed in the paper: the single-point contact is extended to a cylinder-shaped contact volume. This approach eliminates the low quality mesh elements while introducing only slight distortion to the flow as well as to the contact heat transfer. The analysis of numerous test cases performed proves the great potential of the proposed method of meshing packed beds of granular materials.

  12. Implicit upwind schemes for computational fluid dynamics. Solution by domain decomposition

    International Nuclear Information System (INIS)

    Clerc, S.

    1998-01-01

    In this work, the numerical simulation of fluid dynamics equations is addressed. Implicit upwind schemes of finite volume type are used for this purpose. The first part of the dissertation deals with improving computational precision in unfavourable situations. A non-conservative treatment of some source terms is studied in order to correct some shortcomings of the usual operator-splitting method. In addition, finite volume schemes based on Godunov's approach are ill-suited to computing low Mach number flows; a modification of the upwinding by preconditioning is introduced to correct this defect. The second part deals with the solution of steady-state problems arising from an implicit discretization of the equations. A well-posed linearized boundary value problem is formulated. We prove the convergence of a domain decomposition algorithm of Schwarz type for this problem. This algorithm is implemented either directly or in a Schur complement framework. Finally, another approach is proposed, which consists in decomposing the non-linear steady-state problem. (author)
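
As background to the upwind finite volume schemes this record discusses, the basic upwinding idea can be sketched with a minimal explicit first-order upwind step for the linear advection equation u_t + a·u_x = 0 (an illustration only, not the author's implicit scheme):

```python
def upwind_step(u, a, dt, dx):
    """One explicit first-order upwind step for u_t + a*u_x = 0 with a > 0.
    The left boundary value is held fixed (inflow); c is the CFL number."""
    c = a * dt / dx  # stability requires c <= 1
    return [u[0]] + [u[i] - c * (u[i] - u[i - 1]) for i in range(1, len(u))]

# advect a step profile to the right for four time steps (c = 0.5)
u = [1.0] * 5 + [0.0] * 5
for _ in range(4):
    u = upwind_step(u, a=1.0, dt=0.1, dx=0.2)
print(u)
```

Each cell is updated using its upwind (left) neighbour only, which is what makes the scheme stable for rightward transport; the step front smears as it advances, the well-known numerical diffusion of first-order upwinding.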

  13. An Overview of Public Domain Tools for Measuring the Sustainability of Environmental Remediation - 12060

    Energy Technology Data Exchange (ETDEWEB)

    Claypool, John E.; Rogers, Scott [AECOM, Denver, Colorado, 80202 (United States)

    2012-07-01

    their clients. When it comes to the public domain, Federal government agencies are spearheading the development of software tools to measure and report emissions of air pollutants (e.g., carbon dioxide, other greenhouse gases, criteria air pollutants); consumption of energy, water and natural resources; accident and safety risks; project costs and other economic metrics. Most of the tools developed for the Government are available to environmental practitioners without charge, so they are growing in usage and popularity. The key features and metrics calculated by the available public-domain tools for measuring the sustainability of environmental remediation projects share some commonalities, but there are differences amongst the tools. The SiteWise{sup TM} sustainability tool developed for the Navy and US Army is compared with the Sustainable Remediation Tool (SRT{sup TM}) developed for the US Air Force (USAF). In addition, the USAF's Clean Solar and Wind Energy in Environmental Programs (CleanSWEEP), a soon-to-be-released tool for evaluating the economic feasibility of using renewable energy to power remediation systems, is described in the paper. (authors)

  14. Response to a widespread, unauthorized dispersal of radioactive waste in the public domain

    International Nuclear Information System (INIS)

    Wenslawski, F.A.; North, H.S.

    1979-01-01

    In March 1976 State of Nevada radiological health officials became aware that radioactive items destined for disposal at a radioactive waste burial facility near Beatty, Nevada had instead been distributed to wide segments of the public domain. Because the facility was jointly licensed by the State of Nevada and the Federal Nuclear Regulatory Commission, both agencies quickly responded. It was learned that over a period of several years a practice existed at the disposal facility of opening containers, removing contents and allowing employees to take items of worth or fancy. Numerous items such as hand tools, electric motors, laboratory instruments, shipping containers, etc., had received widespread and uncontrolled distribution in the town of Beatty, as well as lesser distributions to other locations. Because the situation had the potential for a significant health and safety impact, a comprehensive recovery operation was conducted. During the course of seven days of intense effort, thirty-five individuals became involved in a comprehensive door-by-door survey and search of the town. Aerial surveys were performed using a helicopter equipped with sensitive radiation detectors, while ground level scans were conducted using a van containing similar instrumentation. Aerial reconnaissance photographs were taken, a special town meeting was held and numerous persons were interviewed. The recovery effort resulted in the retrieval of an estimated 20 to 25 pickup truck loads of radioactively contaminated equipment, as well as several loads of large items returned on a 40-foot flatbed trailer.

  15. Moving domain computational fluid dynamics to interface with an embryonic model of cardiac morphogenesis.

    Directory of Open Access Journals (Sweden)

    Juhyun Lee

    Full Text Available Peristaltic contraction of the embryonic heart tube produces time- and spatial-varying wall shear stress (WSS) and pressure gradients (∇P) across the atrioventricular (AV) canal. Zebrafish (Danio rerio) are a genetically tractable system to investigate cardiac morphogenesis. The use of Tg(fli1a:EGFP)y1 transgenic embryos allowed for delineation and two-dimensional reconstruction of the endocardium. This time-varying wall motion was then prescribed in a two-dimensional moving domain computational fluid dynamics (CFD) model, providing new insights into spatial and temporal variations in WSS and ∇P during cardiac development. The CFD simulations were validated with particle image velocimetry (PIV) across the AV canal, revealing an increase in both velocities and heart rates, but a decrease in the duration of atrial systole from early to later stages. At 20-30 hours post fertilization (hpf), simulation results revealed bidirectional WSS across the AV canal in the heart tube in response to peristaltic motion of the wall. At 40-50 hpf, the tube structure undergoes cardiac looping, accompanied by a nearly 3-fold increase in WSS magnitude. At 110-120 hpf, distinct AV valve, atrium, ventricle, and bulbus arteriosus form, accompanied by incremental increases in both WSS magnitude and ∇P, but a decrease in bi-directional flow. Laminar flow develops across the AV canal at 20-30 hpf, and persists at 110-120 hpf. Reynolds numbers at the AV canal increase from 0.07±0.03 at 20-30 hpf to 0.23±0.07 at 110-120 hpf (p < 0.05, n = 6), whereas Womersley numbers remain relatively unchanged from 0.11 to 0.13. Our moving domain simulations highlight hemodynamic changes in relation to cardiac morphogenesis, thereby providing a 2-D quantitative approach to complement imaging analysis.

  16. Enhancing public access to legal information : A proposal for a new official legal information generic top-level domain

    NARCIS (Netherlands)

    Mitee, Leesi Ebenezer

    2017-01-01

    Abstract: This article examines the use of a new legal information generic Top-Level Domain (gTLD) as a viable tool for easy identification of official legal information websites (OLIWs) and enhancing global public access to their resources. This intervention is necessary because of the existence of

  17. Ethics, big data and computing in epidemiology and public health.

    Science.gov (United States)

    Salerno, Jennifer; Knoppers, Bartha M; Lee, Lisa M; Hlaing, WayWay M; Goodman, Kenneth W

    2017-05-01

    This article reflects on the activities of the Ethics Committee of the American College of Epidemiology (ACE). Members of the Ethics Committee identified an opportunity to elaborate on knowledge gained since the inception of the original Ethics Guidelines published by the ACE Ethics and Standards of Practice Committee in 2000. The ACE Ethics Committee presented a symposium session at the 2016 Epidemiology Congress of the Americas in Miami on the evolving complexities of ethics and epidemiology as it pertains to "big data." This article presents a summary and further discussion of that symposium session. Three topic areas were presented: the policy implications of big data and computing, the fallacy of "secondary" data sources, and the duty of citizens to contribute to big data. A balanced perspective is needed that provides safeguards for individuals but also furthers research to improve population health. Our in-depth review offers next steps for teaching of ethics and epidemiology, as well as for epidemiological research, public health practice, and health policy. To address contemporary topics in the area of ethics and epidemiology, the Ethics Committee hosted a symposium session on the timely topic of big data. Technological advancements in clinical medicine and genetic epidemiology research coupled with rapid advancements in data networks, storage, and computation at a lower cost are resulting in the growth of huge data repositories. Big data increases concerns about data integrity; informed consent; protection of individual privacy, confidentiality, and harm; data reidentification; and the reporting of faulty inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Neutron Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    Energy Technology Data Exchange (ETDEWEB)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jackowski, Marcel P., E-mail: mjack@ime.usp.b [University of Sao Paulo (USP), SP (Brazil). Dept. of Computer Science

    2011-07-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented, Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)
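
The final step the abstract describes, applying a linear filter in the frequency domain, can be illustrated with a generic low-pass example on a 1D signal (an illustrative sketch, not the authors' NSECT filter):

```python
import numpy as np

def lowpass_frequency_filter(signal, cutoff_fraction):
    """Ideal low-pass linear filter applied in the frequency domain:
    FFT the signal, zero every bin above the cutoff, inverse FFT."""
    spectrum = np.fft.rfft(signal)
    cutoff = int(len(spectrum) * cutoff_fraction)
    spectrum[cutoff:] = 0.0  # suppress high-frequency content (e.g. iteration noise)
    return np.fft.irfft(spectrum, n=len(signal))

# a slow sinusoid corrupted by high-frequency noise
n = 256
t = np.arange(n)
clean = np.sin(2 * np.pi * t / n)
noisy = clean + 0.3 * np.sin(2 * np.pi * 60 * t / n)
smoothed = lowpass_frequency_filter(noisy, cutoff_fraction=0.2)
print(float(np.max(np.abs(smoothed - clean))))  # the noise component is removed
```

The same operation generalizes to 2D reconstructed images by filtering the 2D FFT; in practice a smooth roll-off (e.g. Butterworth) is usually preferred over the sharp cutoff used here.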

  19. Neutron Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    International Nuclear Information System (INIS)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio; Jackowski, Marcel P.

    2011-01-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented, Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)

  20. The public understanding of nanotechnology in the food domain: the hidden role of views on science, technology, and nature.

    Science.gov (United States)

    Vandermoere, Frederic; Blanchemanche, Sandrine; Bieberstein, Andrea; Marette, Stephan; Roosen, Jutta

    2011-03-01

    In spite of great expectations about the potential of nanotechnology, this study shows that people are rather ambiguous and pessimistic about nanotechnology applications in the food domain. Our findings are drawn from a survey of public perceptions about nanotechnology food and nanotechnology food packaging (N = 752). Multinomial logistic regression analyses further reveal that knowledge about food risks and nanotechnology significantly influences people's views about nanotechnology food packaging. However, knowledge variables were unrelated to support for nanofood, suggesting that an increase in people's knowledge might not be sufficient to bridge the gap between the excitement some business leaders in the food sector have and the restraint of the public. Additionally, opposition to nanofood was not related to the use of heuristics but to trust in governmental agencies. Furthermore, the results indicate that public perceptions of nanoscience in the food domain significantly relate to views on science, technology, and nature.

  1. DATABASES AND THE SUI-GENERIS RIGHT – PROTECTION OUTSIDE THE ORIGINALITY. THE DISREGARD OF THE PUBLIC DOMAIN

    Directory of Open Access Journals (Sweden)

    Monica LUPAȘCU

    2018-05-01

    Full Text Available This study focuses on databases as they are regulated by Directive no. 96/9/EC on the protection of databases. There are also several references to Romanian Law no. 8/1996 on copyright and neighbouring rights, which implements the mentioned European Directive. The study analyses certain effects that the sui-generis protection has on the public domain, and tries to demonstrate that the regulation specific to databases neglects the interests associated with the public domain. The effect of such a regulation is the abusive creation of databases in which the public domain (meaning information not protected by copyright, such as news, ideas, procedures, methods, systems, processes, concepts, principles and discoveries) ends up being encapsulated and made available only to some private interests, the access to the public domain being regulated indirectly. The study begins by explaining the sui-generis right and its origin. The first mention of databases can be found in the “Green Paper on Copyright” (1998), a document that clearly shows that database protection was thought to cover a sphere of non-protectable information from the scientific and industrial fields. Several arguments are made by the author, most of them based on the report of the public consultation held in 2014 regarding the necessity of the sui-generis right. References are also made to specific case law, namely British Horseracing Board v William Hill and Fixtures Marketing Ltd. The ECJ's decisions in those cases are of great importance in supporting the public interest in accessing information from certain restricted fields derived from the maker's activities, because in the absence of the sui-generis right all this information could be freely accessed and used.

  2. Time-Domain Techniques for Computation and Reconstruction of One-Dimensional Profiles

    Directory of Open Access Journals (Sweden)

    M. Rahman

    2005-01-01

    Full Text Available This paper presents a time-domain technique to compute the electromagnetic fields and to reconstruct the permittivity profile within a one-dimensional medium of finite length. The medium is characterized by permittivity and conductivity profiles which vary only with depth; the scattering problem discussed is thus one-dimensional. The modeling tool is divided into two schemes, named the forward solver and the inverse solver. The task of the forward solver is to compute the internal fields of the specimen, which is performed by a Green’s function approach. When a known electromagnetic wave is incident normally on the medium, the resulting electromagnetic field within the medium can be calculated by constructing a Green’s operator. This operator maps the incident field on either side of the medium to the field at an arbitrary observation point. It is a matrix of integral operators with kernels satisfying known partial differential equations. The reflection and transmission behavior of the medium is also determined from the boundary values of the Green's operator. The inverse solver is responsible for solving an inverse scattering problem by reconstructing the permittivity profile of the medium. Though several algorithms could be used to solve this problem, the invariant embedding method, also known as the layer-stripping method, has been implemented here because it requires only a finite time trace of reflection data. Here only one round trip of reflection data is used, where one round trip is defined by the time required by the pulse to propagate through the medium and back again. The inversion process begins by retrieving the reflection kernel from the reflected wave data by simply using a deconvolution technique. The rest of the task can easily be performed by applying a numerical approach to determine the different profile parameters. Both the solvers have been found to have the
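
The deconvolution step used above to retrieve the reflection kernel can be sketched generically as a regularized (Wiener-style) division in the frequency domain; this is an illustration under simplifying assumptions (circular convolution, synthetic pulse), not the paper's implementation:

```python
import numpy as np

def deconvolve(measured, incident, eps=1e-9):
    """Retrieve a reflection kernel r from measured = incident (*) r
    by regularized division in the frequency domain."""
    n = len(measured)
    inc_f = np.fft.rfft(incident, n)
    mea_f = np.fft.rfft(measured, n)
    # Wiener-style division: the eps term prevents blow-up where |inc_f| ~ 0
    ker_f = mea_f * np.conj(inc_f) / (np.abs(inc_f) ** 2 + eps)
    return np.fft.irfft(ker_f, n=n)

# forward model: convolve a known two-spike kernel with a Gaussian incident
# pulse, then invert to recover the kernel
n = 64
t = np.arange(n)
incident = np.exp(-0.5 * (t - 8.0) ** 2)
kernel = np.zeros(n)
kernel[5], kernel[20] = 0.4, -0.2
measured = np.fft.irfft(np.fft.rfft(incident) * np.fft.rfft(kernel), n=n)
recovered = deconvolve(measured, incident)
print(float(np.max(np.abs(recovered - kernel))))  # close to zero
```

Once the reflection kernel is recovered, the layer-stripping recursion can peel off the medium layer by layer from its leading edge.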

  3. Domain Engineering

    Science.gov (United States)

    Bjørner, Dines

    Before software can be designed we must know its requirements. Before requirements can be expressed we must understand the domain. So it follows, from our dogma, that we must first establish precise descriptions of domains; then, from such descriptions, “derive” at least domain and interface requirements; and from those and machine requirements design the software, or, more generally, the computing systems.

  4. Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health

    Science.gov (United States)

    Gorsky, Martin

    2015-01-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996–2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the ‘declinist’ historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour. PMID:26217072

  5. DAE emergency response centre (ERC) at Kalpakkam for response to nuclear and radiological emergencies in public domain

    International Nuclear Information System (INIS)

    Meenakshisundaram, V.; Rajagopal, V.; Mathiyarasu, R.; Subramanian, V.; Rajaram, S.; Somayaji, K.M.; Kannan, V.; Rajagopalan, H.

    2008-01-01

    In India, Department of Atomic Energy (DAE) has been identified as the nodal agency/authority in respect of providing the necessary technical inputs in the event of any radiation emergency that may occur in public domain. The overall system takes into consideration statutory requirements, executive decisions as well as National and International obligations. This paper highlights the details about the strength of the Kalpakkam ERC and other essential requisites and their compliance since its formation

  6. Modes of Interaction of Pleckstrin Homology Domains with Membranes: Toward a Computational Biochemistry of Membrane Recognition.

    Science.gov (United States)

    Naughton, Fiona B; Kalli, Antreas C; Sansom, Mark S P

    2018-02-02

    Pleckstrin homology (PH) domains mediate protein-membrane interactions by binding to phosphatidylinositol phosphate (PIP) molecules. The structural and energetic basis of selective PH-PIP interactions is central to understanding many cellular processes, yet the molecular complexities of the PH-PIP interactions are largely unknown. Molecular dynamics simulations using a coarse-grained model enable estimation of free-energy landscapes for the interactions of 12 different PH domains with membranes containing PIP2 or PIP3, allowing us to obtain a detailed molecular energetic understanding of the complexities of the interactions of the PH domains with PIP molecules in membranes. Distinct binding modes, corresponding to different distributions of cationic residues on the PH domain, were observed, involving PIP interactions at either the "canonical" (C) and/or "alternate" (A) sites. PH domains can be grouped by the relative strength of their C- and A-site interactions, revealing that higher affinity correlates with increased C-site interactions. These simulations demonstrate that simultaneous binding of multiple PIP molecules by PH domains contributes to high-affinity membrane interactions, informing our understanding of membrane recognition by PH domains in vivo. Copyright © 2017. Published by Elsevier Ltd.

  7. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
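
The domain decomposition idea described above, splitting a large model across several devices, can be illustrated schematically. This sketch (with invented helper names, not SeisCL code) partitions a 1D model into per-device subdomains, each padded with one halo cell per interior boundary as a finite-difference stencil would require:

```python
def decompose(n_cells, n_devices):
    """Split a 1D model of n_cells into contiguous subdomains, one per device.
    Returns (start, end, halo_lo, halo_hi) per device, where the halo range
    pads the owned cells [start, end) with one neighbour cell on each interior
    side, to be exchanged with adjacent devices at every time step."""
    base, extra = divmod(n_cells, n_devices)
    sizes = [base + (1 if d < extra else 0) for d in range(n_devices)]
    domains, start = [], 0
    for size in sizes:
        halo_lo = max(start - 1, 0)            # no halo at the physical boundary
        halo_hi = min(start + size + 1, n_cells)
        domains.append((start, start + size, halo_lo, halo_hi))
        start += size
    return domains

for dom in decompose(10, 3):
    print(dom)
```

Because only the thin halo layers must be exchanged between devices each step while the bulk of the work stays local, the speedup grows roughly linearly with the number of devices for large enough models, consistent with the quasi-linear scaling reported above.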

  8. $h - p$ Spectral element methods for elliptic problems on non-smooth domains using parallel computers

    NARCIS (Netherlands)

    Tomar, S.K.

    2002-01-01

    It is well known that elliptic problems when posed on non-smooth domains, develop singularities. We examine such problems within the framework of spectral element methods and resolve the singularities with exponential accuracy.

  9. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation.

    Science.gov (United States)

    Mourad, Raphaël; Cuvier, Olivier

    2016-05-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment tests and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1.
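
Multiple logistic regression of the kind proposed above, modelling the probability that a locus is a domain border as a function of several genomic features, can be sketched with synthetic data; the features here are invented stand-ins, not the paper's data:

```python
import math
import random

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Multiple logistic regression fit by batch gradient descent:
    models P(border = 1 | features) = sigmoid(w . x + b).
    The sign of each learned weight indicates a positive or negative driver."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(steps):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            for j, xj in enumerate(xi):
                grad_w[j] += (p - yi) * xj
            grad_b += p - yi
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / len(X)
    return w, b

# synthetic data: feature 0 (a stand-in for, say, an insulator-protein signal)
# determines borders; feature 1 is irrelevant noise
random.seed(0)
X = [[random.random(), random.random()] for _ in range(200)]
y = [1 if xi[0] > 0.5 else 0 for xi in X]
w, b = fit_logistic(X, y)
print(w)  # feature 0 gets a much larger positive weight than feature 1
```

In the paper's setting the rows would be candidate border loci and the columns feature signals (e.g. ChIP enrichment of architectural proteins), with positive weights marking positive drivers and negative weights marking counteracting factors.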

  10. Language Choice and Use of Malaysian Public University Lecturers in the Education Domain

    Science.gov (United States)

    Mei, Tam Lee; Abdullah, Ain Nadzimah; Heng, Chan Swee; Kasim, Zalina Binti Mohd

    2016-01-01

    It is a norm for people from a multilingual and multicultural country such as Malaysia to speak at least two or more languages. Thus, the Malaysian multilingual situation resulted in speakers having to make decisions about which languages are to be used for different purposes in different domains. In order to explain the phenomenon of language…

  11. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel , Ronan; Voyer , Damien; Nicolas , Laurent; Scorretti , Riccardo; Burais , Noël

    2011-01-01

    International audience; Computation of electromagnetic fields in high-resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.

  12. A quantitative evaluation of the relative status of journal and conference publications in computer science.

    OpenAIRE

    Coyle, Lorcan; Freyne, Jill; Smyth, Barry; Cunningham, Padraig

    2010-01-01

    While it is universally held by computer scientists that conference publications have a higher status in computer science than in other disciplines there is little quantitative evidence in support of this position. The importance of journal publications in academic promotion makes this a big issue since an exclusive focus on journal papers will miss many significant papers published at conferences in computer science. In this paper we set out to quantify the relative importance of journ...

  13. Public policy and regulatory implications for the implementation of Opportunistic Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning; Henten, Anders

    2012-01-01

    Opportunistic Cloud Computing Services (OCCS) is a social network approach to the provisioning and management of cloud computing services for enterprises. This paper discusses how public policy and regulations will impact on OCCS implementation. We rely on documented, publicly available government and corporate policies on the adoption of cloud computing services and deduce the impact of these policies on their adoption of opportunistic cloud computing services. We conclude that there are regulatory challenges on data protection that raise issues for cloud computing adoption in general, and that the lack of a single globally accepted data protection standard poses some challenges for a very successful implementation of OCCS for companies. However, the direction of current public and corporate policies on cloud computing makes a good case for them to try out opportunistic cloud computing services.

  14. Computer self efficacy as correlate of on-line public access ...

    African Journals Online (AJOL)

    The use of the Online Public Access Catalogue (OPAC) by students has a lot of advantages, and computer self-efficacy is a factor that could determine its effective utilization. Little appears to be known about colleges of education students' use of OPAC, computer self-efficacy and the relationship between OPAC and computer ...

  15. Rock Art and Radiance: Archaeology in the Public Domain as Life-Long Learning.

    Science.gov (United States)

    Ouzman, Sven

    The re-invigoration of storytelling in academic and public spheres allows rock art to offer opportunities to various publics, of which archaeologists are part. But how exactly this process of archaeology as lifelong learning is to proceed is not always clear, particularly in the United States. Until the last half decade of the twentieth century,…

  16. Design and development of semantic web-based system for computer science domain-specific information retrieval

    Directory of Open Access Journals (Sweden)

    Ritika Bansal

    2016-09-01

    Full Text Available In a semantic web-based system, the concept of ontology is used to search results by the contextual meaning of the input query instead of keyword matching. From the research literature, there seems to be a need for a tool which can provide an easy interface for complex queries in natural language and retrieve the domain-specific information from the ontology. This research paper proposes an IRSCSD system (Information retrieval system for computer science domain) as a solution. This system offers advanced querying and browsing of structured data with search results automatically aggregated and rendered directly in a consistent user interface, thus reducing the manual effort of users. So, the main objective of this research is the design and development of a semantic web-based system for integrating ontology towards domain-specific retrieval support. The methodology followed is piecemeal research involving the following stages. The first stage involves designing the framework for the semantic web-based system. The second stage builds the prototype for the framework using the Protégé tool. The third stage deals with natural language query conversion into the SPARQL query language using the Python-based QUEPY framework. The fourth stage involves firing the converted SPARQL queries at the ontology through Apache's Jena API to fetch the results. Lastly, evaluation of the prototype has been done in order to ensure its efficiency and usability. Thus, this research paper throws light on framework development for a semantic web-based system that assists in efficient retrieval of domain-specific information, natural language query interpretation into a semantic web language, and creation of a domain-specific ontology and its mapping with related ontologies. This research paper also provides approaches and metrics for ontology evaluation, applied to the prototype ontology developed, to study the performance based on accessibility of the required domain-related information.

  17. Effects of clinically relevant MPL mutations in the transmembrane domain revealed at the atomic level through computational modeling.

    Science.gov (United States)

    Lee, Tai-Sung; Kantarjian, Hagop; Ma, Wanlong; Yeh, Chen-Hsiung; Giles, Francis; Albitar, Maher

    2011-01-01

    Mutations in the thrombopoietin receptor (MPL) may activate relevant pathways and lead to chronic myeloproliferative neoplasms (MPNs). The mechanisms of MPL activation remain elusive because of a lack of experimental structures. Modern computational biology techniques were utilized to explore the mechanisms of MPL protein activation due to various mutations. Transmembrane (TM) domain predictions, homology modeling, ab initio protein structure prediction, and molecular dynamics (MD) simulations were used to build structural dynamic models of wild-type and four clinically observed mutants of MPL. The simulation results suggest that S505 and W515 are important in keeping the TM domain in its correct position within the membrane. Mutations at either of these two positions cause movement of the TM domain, altering the conformation of the nearby intracellular domain in unexpected ways, and may cause the unwanted constitutive activation of MPL's kinase partner, JAK2. Our findings represent the first full-scale molecular dynamics simulations of the wild-type and clinically observed mutants of the MPL protein, a critical element of the MPL-JAK2-STAT signaling pathway. In contrast to usual explanations for the activation mechanism that are based on the relative translational movement between rigid domains of MPL, our results suggest that mutations within the TM region could result in conformational changes including tilt and rotation (azimuthal) angles along the membrane axis. Such changes may significantly alter the conformation of the adjacent and intrinsically flexible intracellular domain. Hence, caution should be exercised when interpreting experimental evidence based on rigid models of cytokine receptors or similar systems.

  18. General design methodology applied to the research domain of physical programming for computer illiterate

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2011-09-01

    Full Text Available The authors discuss the application of the 'general design methodology' in the context of a physical computing project. The aim of the project was to design and develop physical objects that could serve as metaphors for computer programming elements...

  19. The Battle to Secure Our Public Access Computers

    Science.gov (United States)

    Sendze, Monique

    2006-01-01

    Securing public access workstations should be a significant part of any library's network and information-security strategy because of the sensitive information patrons enter on these workstations. As the IT manager for the Johnson County Library in Kansas City, Kan., this author is challenged to make sure that thousands of patrons get the access…

  20. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence, which accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
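
Stepwise multiple regression of the kind reported can be sketched as greedy forward selection over ordinary-least-squares fits. The data below are synthetic; the predictor indices and effect sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic survey: 8 candidate predictors of a competence score; only
# three (invented indices 0, 3, 5) actually carry signal.
n, p = 300, 8
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.8 * X[:, 5] + rng.standard_normal(n)

def forward_stepwise(X, y, max_vars):
    """Greedy forward selection: at each step add the predictor that most
    reduces the residual sum of squares of an OLS fit."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_vars):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([np.ones(len(y))] + [X[:, k] for k in cols])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = float(np.sum((y - A @ beta) ** 2))
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

chosen = forward_stepwise(X, y, 3)
print(chosen)   # the informative predictors, strongest effect first
```

Full stepwise procedures also test whether already-included variables should be dropped (backward steps) and use F- or p-value thresholds rather than a fixed variable count; this sketch keeps only the forward pass.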

  1. Designing personal attentive user interfaces in the mobile public safety domain

    NARCIS (Netherlands)

    Streefkerk, J.W.; Esch van-Bussemakers, M.P.; Neerincx, M.A.

    2006-01-01

    In the mobile computing environment, there is a need to adapt the information and service provision to the momentary attentive state of the user, operational requirements and usage context. This paper proposes to design personal attentive user interfaces (PAUI) for which the content and style of

  2. Computational Study of Correlated Domain Motions in the AcrB Efflux Transporter

    Directory of Open Access Journals (Sweden)

    Robert Schulz

    2015-01-01

    Full Text Available As an active part of the major efflux system in E. coli bacteria, AcrB is responsible for the uptake and pumping of toxic substrates from the periplasm toward the extracellular space. In combination with the channel protein TolC and the membrane fusion protein AcrA, this efflux pump is able to help the bacterium to survive different kinds of noxious compounds. With the present study we intend to enhance the understanding of the interactions between the domains and monomers, for example, the transduction of mechanical energy from the transmembrane domain into the porter domain, correlated motions of different subdomains within monomers, and cooperative effects between monomers. To this end, targeted molecular dynamics simulations have been employed, either steering the whole protein complex or specific parts thereof. By forcing only parts of the complex towards specific conformational states, the risk of transient artificial conformations during the simulations is reduced. Distinct cooperative effects between the monomers in AcrB have been observed. Possible allosteric couplings have been identified, providing microscopic insights that might be exploited to design more efficient inhibitors of efflux systems.

  3. Integrating publicly-available data to generate computationally ...

    Science.gov (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrates that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar
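
Frequent itemset mining, used above to infer edges, can be sketched with a naive Apriori-style miner. The treatment/effect labels below are hypothetical toy records, not data drawn from TG-GATEs or ToxCast.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=2):
    """Naive Apriori-style miner: itemsets (up to max_size) whose support,
    the fraction of transactions containing them, reaches min_support."""
    n = len(transactions)
    result = {}
    counts = Counter(item for t in transactions for item in set(t))
    current = {frozenset([i]) for i, c in counts.items() if c / n >= min_support}
    result.update({s: counts[next(iter(s))] / n for s in current})
    for size in range(2, max_size + 1):
        # Apriori pruning: build candidates only from items that survived
        items = sorted({i for s in current for i in s})
        candidates = [frozenset(c) for c in combinations(items, size)]
        cand_counts = Counter()
        for t in transactions:
            ts = set(t)
            for c in candidates:
                if c <= ts:
                    cand_counts[c] += 1
        current = {c for c in candidates if cand_counts[c] / n >= min_support}
        result.update({c: cand_counts[c] / n for c in current})
    return result

# Hypothetical treatment/effect co-occurrence records (illustrative only).
records = [
    {"CCl4", "steatosis", "oxidative_stress"},
    {"CCl4", "steatosis"},
    {"CCl4", "oxidative_stress"},
    {"ethanol", "steatosis"},
]
itemsets = frequent_itemsets(records, min_support=0.5)
print(itemsets[frozenset({"CCl4", "steatosis"})])   # 0.5
```

Itemsets that pass the support threshold become candidate edges between treatments and phenotypes in a cpAOP-style graph.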

  4. The politics of public domain : Ethical urbanism around marketplaces in London, Amsterdam & Istanbul

    NARCIS (Netherlands)

    Janssens, F.

    2017-01-01

    New marketplaces pop up every day in cities around the world. As catalysts for gentrification, however, these new marketplaces are often celebrated at the expense of the public markets that they replace. This replacement is symptomatic of today's urban governance, which is characterised by what I

  5. Collaborative Creativity: A Computational Approach: Raw Shaping Form Finding in Higher Education Domain

    NARCIS (Netherlands)

    Wendrich, Robert E.; Guerrero, J.E.

    2013-01-01

    This paper examines the conceptual synthesis processes in conjunction with assistive computational support for individual and collaborative interaction. We present findings from two educational design interaction experiments in product creation processing (PCP). We focus on metacognitive aspects of

  6. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia had issued a regulation to substitute the computer terms in foreign languages that had been used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). After sixteen years, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation is conducted to discover the implementation of official computer term usage in scientific publications written in Bahasa Indonesia. The data source used in this observation is publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting using the Open Archives Initiative - Protocol for Metadata Harvesting (OAI-PMH). Second, converting the harvested documents (in PDF format) to plain text. The third stage is text pre-processing in preparation for string matching. Then the final stage is searching for the official computer terms, based on the 629 SPI terms, using the Boyer-Moore algorithm. We observed that there are 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
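
The abstract names the Boyer-Moore algorithm for the matching stage. A minimal sketch using only the bad-character heuristic (the full algorithm adds the good-suffix rule); the sample sentence is invented, and "unduh" ("download") is used as an example of an official Indonesian computer term:

```python
def boyer_moore_search(text, pattern):
    """Boyer-Moore string search using only the bad-character heuristic.
    Returns all start indices of pattern in text."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    last = {ch: i for i, ch in enumerate(pattern)}   # rightmost occurrence
    hits, s = [], 0
    while s <= n - m:
        j = m - 1
        while j >= 0 and pattern[j] == text[s + j]:  # compare right to left
            j -= 1
        if j < 0:
            hits.append(s)
            s += 1                                   # allow overlapping matches
        else:
            # shift so the mismatched text character aligns with its
            # rightmost occurrence in the pattern (or skip past it)
            s += max(1, j - last.get(text[s + j], -1))
    return hits

doc = "Pengguna dapat mengunduh berkas; proses unduh memerlukan jaringan."
print(boyer_moore_search(doc, "unduh"))   # [19, 40]
```

Scanning each document once per SPI term this way is quadratic in the number of terms times document length; for 629 terms a multi-pattern method such as Aho-Corasick would be the usual alternative, but the per-term Boyer-Moore matches what the abstract describes.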

  7. Computationally Efficient Amplitude Modulated Sinusoidal Audio Coding using Frequency-Domain Linear Prediction

    DEFF Research Database (Denmark)

    Christensen, M. G.; Jensen, Søren Holdt

    2006-01-01

    A method for amplitude modulated sinusoidal audio coding is presented that has low complexity and low delay. This is based on a subband processing system, where, in each subband, the signal is modeled as an amplitude modulated sum of sinusoids. The envelopes are estimated using frequency-domain linear prediction and the prediction coefficients are quantized. As a proof of concept, we evaluate different configurations in a subjective listening test, and this shows that the proposed method offers significant improvements in sinusoidal coding. Furthermore, the properties of the frequency...
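
The core of any linear-prediction scheme is the Levinson-Durbin recursion on an autocorrelation sequence; frequency-domain linear prediction applies the same recursion to spectral (e.g. DCT) coefficients to model temporal envelopes, rather than to time samples as in classical LPC. A sketch of the recursion on a time-domain AR(2) signal, whose coefficients it should approximately recover (the signal parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

def levinson_durbin(r, order):
    """Levinson-Durbin recursion: prediction polynomial a (with a[0] = 1)
    and residual energy for an autocorrelation sequence r."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i]
        for j in range(1, i):
            acc += a[j] * r[i - j]
        k = -acc / err                 # reflection coefficient
        prev = a.copy()
        for j in range(1, i):
            a[j] = prev[j] + k * prev[i - j]
        a[i] = k
        err *= 1.0 - k * k
    return a, err

# AR(2) test signal: x[t] = 1.3 x[t-1] - 0.6 x[t-2] + noise
n = 20000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = 1.3 * x[t - 1] - 0.6 * x[t - 2] + e[t]

order = 2
r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
a, _ = levinson_durbin(r, order)
print(a)   # approximately [1.0, -1.3, 0.6]
```

In a codec the coefficients a (or the reflection coefficients k) are what get quantized and transmitted, since they compactly describe the envelope's spectral shape.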

  8. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

    Full Text Available of research articles published in SACJ over its first 40 volumes, using the ACM Computing Classification Scheme as a basis. In their analysis, the authors divided the publications into three cycles of roughly six years in order to identify...

  9. The Administrative Impact of Computers on the British Columbia Public School System.

    Science.gov (United States)

    Gibbens, Trevor P.

    This case study analyzes and evaluates the administrative computer systems in the British Columbia public school organization in order to investigate the costs and benefits of computers, their impact on managerial work, their influence on centralization in organizations, and the relationship between organizational objectives and the design of…

  10. The Impact of Social Computing on Public Services : a Rationale for Government 2 . 0

    NARCIS (Netherlands)

    Broek, Tijs Van Den; Frissen, Valerie; Huijboom, Noor; Punie, Yves

    2010-01-01

    In this article the impact of the fast-emerging social computing trend on the public sector is explored. This exploration is based on the results of a study commissioned by the Institute for Prospective and Technological Studies (IPTS). Three cases of social computing initiatives in diverse

  11. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    Science.gov (United States)

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  12. Unbound motion on a Schwarzschild background: Practical approaches to frequency domain computations

    Science.gov (United States)

    Hopper, Seth

    2018-03-01

    Gravitational perturbations due to a point particle moving on a static black hole background are naturally described in Regge-Wheeler gauge. The first-order field equations reduce to a single master wave equation for each radiative mode. The master function satisfying this wave equation is a linear combination of the metric perturbation amplitudes with a source term arising from the stress-energy tensor of the point particle. The original master functions were found by Regge and Wheeler (odd parity) and Zerilli (even parity). Subsequent work by Moncrief and then Cunningham, Price and Moncrief introduced new master variables which allow time domain reconstruction of the metric perturbation amplitudes. Here, I explore the relationship between these different functions and develop a general procedure for deriving new higher-order master functions from ones already known. The benefit of higher-order functions is that their source terms always converge faster at large distance than their lower-order counterparts. This makes for a dramatic improvement in both the speed and accuracy of frequency domain codes when analyzing unbound motion.

  13. Computational Design of High-χ Block Oligomers for Accessing 1 nm Domains.

    Science.gov (United States)

    Chen, Qile P; Barreda, Leonel; Oquendo, Luis E; Hillmyer, Marc A; Lodge, Timothy P; Siepmann, J Ilja

    2018-05-22

    Molecular dynamics simulations are used to design a series of high-χ block oligomers (HCBOs) that can self-assemble into a variety of mesophases with domain sizes as small as 1 nm. The exploration of these oligomers with various chain lengths, volume fractions, and chain architectures at multiple temperatures reveals the presence of ordered lamellae, perforated lamellae, and hexagonally packed cylinders. The achieved periods are as small as 3.0 and 2.1 nm for lamellae and cylinders, respectively, which correspond to polar domains of approximately 1 nm. Interestingly, the detailed phase behavior of these oligomers is distinct from that of either solvent-free surfactants or block polymers. The simulations reveal that the behavior of these HCBOs is a product of an interplay between both "surfactant factors" (headgroup interactions, chain flexibility, and interfacial curvature) and "block polymer factors" (χ, chain length N, and volume fraction f). This insight promotes the understanding of molecular features pivotal for mesophase formation at the sub-5 nm length scale, which facilitates the design of HCBOs tailored toward particular desired morphologies.

  14. Telling and measuring urban floods: event reconstruction by means of public-domain media

    Science.gov (United States)

    Macchia, S.; Gallo, E.; Claps, P.

    2012-04-01

    In the last decade, the diffusion of mobile telephones and of low-cost digital cameras has changed the public approach to catastrophes. As regards floods, images and videos taken in urban areas have become widely available. Searching YouTube or YouReporter, for example, one can see how often citizens report even frightening events. Nowadays these amateur videos are often used in world news reports, which can amplify or dampen the public perception of flood risk. More importantly, these amateur videos can play a crucial role in a didactic and technical representation of flooding problems. The question thus arises: why not use amateur videos for civil protection purposes? This work shows a new way to use flood images and videos to obtain technical data and spread safety information. Specifically, we show how to determine the height and speed of the water flow reached in some places during the Genoa flood of 4th November 2011. For this event we downloaded more than 50 videos from different websites, whose authors provided information about the time of recording, the geographical coordinates and the height above ground of the recording point. The support of Google tools such as Google Maps and Street View has allowed us to locate the recording points geographically and to assemble the shots needed for a complete reconstruction of the event. Future research will move in the direction of using these videos to generate a tool for the Google platforms, in order to deliver easily accessible yet accurate information to the public and to warn people how to behave in the face of imminent floods.

  15. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological researches, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic process, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  16. Using Application-Domain Knowledge in the Runtime Support of Multi-Experiment Computational Studies

    Science.gov (United States)

    2009-01-01


  17. Public Services 2.0: The Impact of Social Computing on Public Services

    OpenAIRE

    Punie, Y.; Misuraca, G.; Osimo, D.; Huijboom, N.; Broek, T.A. van den; Frissen, V.; Kool, L.

    2010-01-01

    Since 2003, the Internet has seen impressive growth in user-driven applications such as blogs, podcasts, wikis and social networking sites. This trend is referred to here as ‘social computing’ as online applications increasingly support the creation of value by social networks of people. The social computing trend has been recognised and monitored by the Institute for Prospective and Technological Studies (IPTS) over the past few years. IPTS observed a viral take up of social computing applic...

  18. Polyhedral meshing as an innovative approach to computational domain discretization of a cyclone in a fluidized bed CLC unit

    Directory of Open Access Journals (Sweden)

    Sosnowski Marcin

    2017-01-01

    Full Text Available Chemical Looping Combustion (CLC) is a technology that allows the separation of CO2, which is generated by the combustion of fossil fuels. The majority of process designs currently under investigation are systems of coupled fluidized beds. Advances in the development of power generation systems using CLC cannot be introduced without using numerical modelling as a research tool. The primary and critical activity in numerical modelling is the computational domain discretization. It influences the numerical diffusion as well as the convergence of the model and therefore the overall accuracy of the obtained results. Hence an innovative approach to computational domain discretization using a polyhedral (POLY) mesh is proposed in the paper. This method reduces both the numerical diffusion of the mesh and the time cost of preparing the model for subsequent calculation. The major advantage of a POLY mesh is that each individual cell has many neighbours, so gradients can be much better approximated in comparison to the commonly used tetrahedral (TET) mesh. POLYs are also less sensitive to stretching than TETs, which results in better numerical stability of the model. Therefore a detailed comparison of numerical modelling results concerning a subsection of the CLC system using tetrahedral and polyhedral meshes is covered in the paper.

  19. Computer simulation of temperature-dependent growth of fractal and compact domains in diluted Ising models

    DEFF Research Database (Denmark)

    Sørensen, Erik Schwartz; Fogedby, Hans C.; Mouritsen, Ole G.

    1989-01-01

    temperature are studied as functions of temperature, time, and concentration. At zero temperature and high dilution, the growing solid is found to have a fractal morphology and the effective fractal exponent D varies with concentration and ratio of time scales of the two dynamical processes. The mechanism responsible for forming the fractal solid is shown to be a buildup of a locally high vacancy concentration in the active growth zone. The growth-probability measure of the fractals is analyzed in terms of multifractality by calculating the f(α) spectrum. It is shown that the basic ideas of relating probability measures of static fractal objects to the growth-probability distribution during formation of the fractal apply to the present model. The f(α) spectrum is found to be in the universality class of diffusion-limited aggregation. At finite temperatures, the fractal solid domains become metastable...

  20. Data Prediction for Public Events in Professional Domains Based on Improved RNN- LSTM

    Science.gov (United States)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    Traditional data services for the prediction of emergency or non-periodic events usually cannot generate satisfying results or fulfill the correct prediction purpose. However, these events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model, an LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation model, combining RNN-LSTM with a priori information on public events. In prediction tasks, the model is qualified for determining trends, and its accuracy is also validated. This model generates better performance and prediction results than the previous one. Using a priori information can increase the accuracy of prediction; LSTM can better adapt to changes in the time sequence; and LSTM can be widely applied to the same type of prediction tasks, as well as other prediction tasks related to time sequences.

  1. Luminescence dose reconstruction using personal objects and material available in the environment and public domain

    International Nuclear Information System (INIS)

    Goeksu, H. Y.

    2006-01-01

    There is growing concern in the public about accidental radiation exposures due to the ageing of the nuclear power industry, illegal dumping of nuclear waste, or terrorist activities, which may increase the health risks to individuals or to large sections of the public. In cases where no direct radiation monitoring data are available, luminescence dose reconstruction using material from the immediate environment of populations or persons can be used to validate values obtained from numerical simulations. In recent years, especially after the Chernobyl accident, techniques and methodology of luminescence dose reconstruction using fired building material have advanced to such an extent that radiation from anthropogenic sources as low as 10 mGy can be resolved within two years after the event. It was demonstrated that luminescence measurements using bricks, combined with Monte-Carlo simulations of photon transport for a given source geometry and distribution, can provide quantities to derive doses for populations or groups of people living in contaminated areas that can be used for epidemiological studies. In this presentation, recent approaches in luminescence dose reconstruction using un-fired building materials such as concrete and silica brick will be reviewed, and the possible use of personal artefacts such as telephone chip cards or prosthetic and restorative teeth will be discussed. The review will include the results of the joint efforts of the international team supported by the EU at the Chernobyl-affected territories, areas affected by the activities of plutonium production facilities in the Southern Urals (Russia), and settlements around the Semipalatinsk nuclear bomb test sites.

  2. Solving Problems in Various Domains by Hybrid Models of High Performance Computations

    Directory of Open Access Journals (Sweden)

    Yurii Rogozhin

    2014-03-01

    Full Text Available This work presents a hybrid model of high performance computations. The model is based on membrane systems (P systems), where some membranes may contain a quantum device that is triggered by the data entering the membrane. This model is supposed to take advantage of both the biomolecular and quantum paradigms and to overcome some of their inherent limitations. The proposed approach is demonstrated on two selected problems: SAT and image retrieving.

  3. Psychometric characteristics of a public-domain self-report measure of vocational interests: the Oregon Vocational Interest Scales.

    Science.gov (United States)

    Pozzebon, Julie A; Visser, Beth A; Ashton, Michael C; Lee, Kibeom; Goldberg, Lewis R

    2010-03-01

    We investigated the psychometric properties of the Oregon Vocational Interest Scales (ORVIS), a brief public-domain alternative to commercial inventories, in a large community sample and in a college sample. In both samples, we examined the factor structure, scale intercorrelations, and personality correlates of the ORVIS, and in the community sample, we also examined the correlations of the ORVIS scales with cognitive abilities and with the scales of a longer, proprietary interest survey. In both samples, all 8 scales (Leadership, Organization, Altruism, Creativity, Analysis, Producing, Adventuring, and Erudition) showed wide variation in scores, high internal-consistency reliabilities, and a pattern of high convergent and low discriminant correlations with the scales of the proprietary interest survey. Overall, the results support the construct validity of the scales, which are recommended for use in research on vocational interests and other individual differences.
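    The internal-consistency reliabilities mentioned above are conventionally computed as Cronbach's alpha; a minimal, self-contained sketch of that statistic (illustrative only, not code from the study):

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """scores: one list of k item scores per respondent."""
    k = len(scores[0])
    items = list(zip(*scores))             # transpose to per-item columns
    totals = [sum(r) for r in scores]      # each respondent's scale total
    item_var = sum(variance(list(col)) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Perfectly parallel items give the theoretical maximum, alpha = 1.
resp = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
print(round(cronbach_alpha(resp), 3))      # 1.0
```

    Real scales fall below this maximum; values around 0.8 or higher are usually read as high internal consistency.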

  4. Evolution of Industry Knowledge in the Public Domain: Prior Art Searching for Software Patents

    Directory of Open Access Journals (Sweden)

    Jinseok Park

    2005-03-01

    Full Text Available Searching prior art is a key part of the patent application and examination processes. A comprehensive prior art search gives the inventor ideas as to how he can improve or circumvent existing technology by providing up to date knowledge on the state of the art. It also enables the patent applicant to minimise the likelihood of an objection from the patent office. This article explores the characteristics of prior art associated with software patents, dealing with difficulties in searching prior art due to the lack of resources, and considers public contribution to the formation of prior art databases. It addresses the evolution of electronic prior art in line with technological development, and discusses laws and practices in the EPO, USPTO, and the JPO in relation to the validity of prior art resources on the Internet. This article also investigates the main features of searching sources and tools in the three patent offices as well as non-patent literature databases. Based on the analysis of various searching databases, it provides some strategies of efficient prior art searching that should be considered for software-related inventions.

  5. The Importance of Computer Science for Public Health Training: An Opportunity and Call to Action.

    Science.gov (United States)

    Kunkle, Sarah; Christie, Gillian; Yach, Derek; El-Sayed, Abdulrahman M

    2016-01-01

    A century ago, the Welch-Rose Report established a public health education system in the United States. Since then, the system has evolved to address emerging health needs and integrate new technologies. Today, personalized health technologies generate large amounts of data. Emerging computer science techniques, such as machine learning, present an opportunity to extract insights from these data that could help identify high-risk individuals and tailor health interventions and recommendations. As these technologies play a larger role in health promotion, collaboration between the public health and technology communities will become the norm. Offering public health trainees coursework in computer science alongside traditional public health disciplines will facilitate this evolution, improving public health's capacity to harness these technologies to improve population health.

  6. Computational domain length and Reynolds number effects on large-scale coherent motions in turbulent pipe flow

    Science.gov (United States)

    Feldmann, Daniel; Bauer, Christian; Wagner, Claus

    2018-03-01

    We present results from direct numerical simulations (DNS) of turbulent pipe flow at shear Reynolds numbers up to Reτ = 1500 using different computational domains with lengths up to ?. The objectives are to analyse the effect of the finite size of the periodic pipe domain on large flow structures as a function of Reτ and to assess a minimum ? required for relevant turbulent scales to be captured and a minimum Reτ for very large-scale motions (VLSM) to be analysed. Analysing one-point statistics revealed that the mean velocity profile is invariant for ?. The wall-normal location at which deviations occur in shorter domains changes strongly with increasing Reτ from the near-wall region to the outer layer, where VLSM are believed to live. The root mean square velocity profiles exhibit domain length dependencies for pipes shorter than 14R and 7R, depending on Reτ. For all Reτ, the higher-order statistical moments show only weak dependencies, and only for the shortest domain considered here. However, the analysis of one- and two-dimensional pre-multiplied energy spectra revealed that even for larger ?, not all physically relevant scales are fully captured, even though the aforementioned statistics are in good agreement with the literature. We found ? to be sufficiently large to capture VLSM-relevant turbulent scales in the considered range of Reτ based on our definition of an integral energy threshold of 10%. The requirement to capture at least 1/10 of the global maximum energy level is justified by a 14% increase of the streamwise turbulence intensity in the outer region between Reτ = 720 and 1500, which can be related to VLSM-relevant length scales. Based on this scaling anomaly, we found Reτ⪆1500 to be a necessary minimum requirement to investigate VLSM-related effects in pipe flow, even though the streamwise energy spectra do not yet indicate sufficient scale separation between the most energetic and the very long motions.

  7. Monitoring Urban Tree Cover Using Object-Based Image Analysis and Public Domain Remotely Sensed Data

    Directory of Open Access Journals (Sweden)

    Meghan Halabisky

    2011-10-01

    Full Text Available Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies; however, such techniques are not generally appropriate for assessing these highly variable landscapes. Landsat imagery has historically been used for per-pixel driven land use/land cover (LULC) classifications, but the spatial resolution limits our ability to map small urban features. In such cases, hyperspatial resolution imagery such as aerial or satellite imagery with a resolution of 1 meter or below is preferred. Object-based image analysis (OBIA) allows for the use of additional variables such as texture, shape, context, and other cognitive information provided by the image analyst to segment and classify image features, and thus improve classifications. As part of this research we created LULC classifications for a pilot study area in Seattle, WA, USA, using OBIA techniques and freely available public aerial photography. We analyzed the differences in accuracies which can be achieved with OBIA using multispectral and true-color imagery. We also compared our results to a satellite-based OBIA LULC and discussed the implications of per-pixel driven vs. OBIA-driven field sampling campaigns. We demonstrated that the OBIA approach can generate good and repeatable LULC classifications suitable for tree cover assessment in urban areas. Another important finding is that spectral content appeared to be more important than spatial detail of hyperspatial data when it comes to an OBIA-driven LULC classification.

  8. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    International Nuclear Information System (INIS)

    Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong

    2016-01-01

    Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).

  9. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    Energy Technology Data Exchange (ETDEWEB)

    Bhateja, Vikrant, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn; Moin, Aisha; Srivastava, Anuja [Shri Ramswaroop Memorial Group of Professional Colleges (SRMGPC), Lucknow, Uttar Pradesh 226028 (India); Bao, Le Nguyen [Duytan University, Danang 550000 (Viet Nam); Lay-Ekuakille, Aimé [Department of Innovation Engineering, University of Salento, Lecce 73100 (Italy); Le, Dac-Nhuong, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn [Duytan University, Danang 550000 (Viet Nam); Haiphong University, Haiphong 180000 (Viet Nam)

    2016-07-15

    Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).

  10. A novel image-domain-based cone-beam computed tomography enhancement algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Li Xiang; Li Tianfang; Yang Yong; Heron, Dwight E; Huq, M Saiful, E-mail: lix@upmc.edu [Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, PA 15232 (United States)

    2011-05-07

    Kilo-voltage (kV) cone-beam computed tomography (CBCT) plays an important role in image-guided radiotherapy. However, due to a large cone-beam angle, scatter effects significantly degrade the CBCT image quality and limit its clinical application. The goal of this study is to develop an image enhancement algorithm to reduce the low-frequency CBCT image artifacts, which are also called the bias field. The proposed algorithm is based on the hypothesis that image intensities of different types of materials in CBCT images are approximately globally uniform (in other words, a piecewise property). A maximum a posteriori probability framework was developed to estimate the bias field contribution from a given CBCT image. The performance of the proposed CBCT image enhancement method was tested using phantoms and clinical CBCT images. Compared to the original CBCT images, the corrected images using the proposed method achieved a more uniform intensity distribution within each tissue type and significantly reduced cupping and shading artifacts. In a head and a pelvic case, the proposed method reduced the Hounsfield unit (HU) errors within the region of interest from 300 HU to less than 60 HU. In a chest case, the HU errors were reduced from 460 HU to less than 110 HU. The proposed CBCT image enhancement algorithm demonstrated a promising result by the reduction of the scatter-induced low-frequency image artifacts commonly encountered in kV CBCT imaging.
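    The piecewise-uniformity hypothesis can be illustrated in one dimension: if tissue intensity is locally constant, a low-pass estimate of an intensity profile approximates the multiplicative bias field, which can then be divided out. The following is a toy sketch of that principle, not the paper's maximum a posteriori framework:

```python
def moving_average(xs, half):
    """Crude low-pass filter: mean over a window of +/- half samples."""
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def correct_bias(profile, half=5):
    """Divide out the estimated low-frequency bias, preserving the mean level."""
    bias = moving_average(profile, half)
    mean = sum(profile) / len(profile)
    return [v / b * mean for v, b in zip(profile, bias)]

# Uniform tissue (value 100) corrupted by a slow linear shading field.
biased = [100 * (1 + 0.002 * i) for i in range(50)]
flat = correct_bias(biased, half=10)
print(max(flat) - min(flat) < max(biased) - min(biased))  # True: shading reduced
```

    In 2D or 3D CBCT the same idea applies per tissue class, which is why the cupping and shading artifacts flatten out once the bias field is removed.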

  11. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  12. 76 FR 67418 - Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing...

    Science.gov (United States)

    2011-11-01

    ...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing... Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is... (USG) agencies to accelerate their adoption of cloud computing. The roadmap has been developed through...

  13. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain.

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-08-12

    A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of the correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach achieved a better (numerically lower) average rank than the term-based approach for each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at p < 0.05 using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
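    The evaluation metric, the rank of the correct definition among scored candidates, can be sketched as follows (the candidate names and similarity scores below are hypothetical, not data from the study):

```python
def rank_of_correct(scores, correct):
    """1-based rank of the correct definition when candidates are
    sorted by descending score; lower rank means better retrieval."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(correct) + 1

# Hypothetical similarity scores for four candidate rule definitions,
# where "def_A" is the correct one.
semantic = {"def_A": 0.9, "def_B": 0.7, "def_C": 0.4, "def_D": 0.2}
term     = {"def_A": 0.3, "def_B": 0.8, "def_C": 0.6, "def_D": 0.1}
print(rank_of_correct(semantic, "def_A"))  # 1
print(rank_of_correct(term, "def_A"))      # 3
```

    Averaging this rank over all test snippets gives the per-scenario numbers reported above, and the paired Wilcoxon signed-rank test then compares the two systems on the same snippets.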

  14. Use of media and public-domain Internet sources for detection and assessment of plant health threats.

    Science.gov (United States)

    Thomas, Carla S; Nelson, Noele P; Jahn, Gary C; Niu, Tianchan; Hartley, David M

    2011-09-05

    Event-based biosurveillance is a recognized approach to early warning and situational awareness of emerging health threats. In this study, we build upon previous human and animal health work to develop a new approach to plant pest and pathogen surveillance. We show that monitoring public domain electronic media for indications and warning of epidemics and associated social disruption can provide information about the emergence and progression of plant pest infestation or disease outbreak. The approach is illustrated using a case study, which describes a plant pest and pathogen epidemic in China and Vietnam from February 2006 to December 2007, and the role of ducks in contributing to zoonotic virus spread in birds and humans. This approach could be used as a complementary method to traditional plant pest and pathogen surveillance to aid global and national plant protection officials and political leaders in early detection and timely response to significant biological threats to plant health, economic vitality, and social stability. This study documents the inter-relatedness of health in human, animal, and plant populations and emphasizes the importance of plant health surveillance.

  15. Selected ICAR Data from the SAPA-Project: Development and Initial Validation of a Public-Domain Measure

    Directory of Open Access Journals (Sweden)

    David M. Condon

    2016-01-01

    Full Text Available These data were collected during the initial evaluation of the International Cognitive Ability Resource (ICAR project. ICAR is an international collaborative effort to develop open-source public-domain tools for cognitive ability assessment, including tools that can be administered in non-proctored environments (e.g., online administration and those which are based on automatic item generation algorithms. These data provide initial validation of the first four ICAR item types as reported in Condon & Revelle [1]. The 4 item types contain a total of 60 items: 9 Letter and Number Series items, 11 Matrix Reasoning items, 16 Verbal Reasoning items and 24 Three-dimensional Rotation items. Approximately 97,000 individuals were administered random subsets of these 60 items using the Synthetic Aperture Personality Assessment method between August 18, 2010 and May 20, 2013. The data are available in rdata and csv formats and are accompanied by documentation stored as a text file. Re-use potential includes a wide range of structural and item-level analyses.
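    The Synthetic Aperture Personality Assessment method gives each respondent a random subset of the full item bank; a minimal sketch of that sampling step (the item names are placeholders, not actual ICAR item identifiers):

```python
import random

def administer(item_bank, k, seed=None):
    """Draw a random k-item subset for one respondent (SAPA-style sampling)."""
    rng = random.Random(seed)
    return rng.sample(item_bank, k)

item_bank = [f"item_{i:02d}" for i in range(60)]   # the 60 ICAR items
subset = administer(item_bank, k=12, seed=1)
print(len(subset), len(set(subset)))               # 12 distinct items
```

    Because every pair of items co-occurs for some fraction of respondents, pairwise covariances, and hence the full item correlation matrix, can still be estimated from the pooled data.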

  16. The effect of finite-difference time-domain resolution and power-loss computation method on SAR values in plane-wave exposure of Zubal phantom

    International Nuclear Information System (INIS)

    Uusitupa, T M; Ilvonen, S A; Laakso, I M; Nikoskinen, K I

    2008-01-01

    In this paper, the anatomically realistic body model Zubal is exposed to a plane wave. A finite-difference time-domain (FDTD) method is used to obtain field data for specific-absorption-rate (SAR) computation. It is investigated how the FDTD resolution, power-loss computation method and positioning of the material voxels in the FDTD grid affect the SAR results. The results enable one to estimate the effects due to certain fundamental choices made in the SAR simulation
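    After the FDTD run, local SAR is conventionally obtained per voxel from the peak E-field amplitude as SAR = σ|E|²/(2ρ); a minimal sketch of that post-processing step (the tissue values below are illustrative, not Zubal phantom data):

```python
def voxel_sar(e_peak, sigma, rho):
    """Local SAR in W/kg from the peak E-field amplitude |E| (V/m),
    conductivity sigma (S/m) and mass density rho (kg/m^3)."""
    return sigma * e_peak ** 2 / (2.0 * rho)

def whole_body_sar(voxels):
    """Mass-weighted average over (e_peak, sigma, rho, mass) voxels."""
    total_mass = sum(m for *_, m in voxels)
    absorbed = sum(voxel_sar(e, s, r) * m for e, s, r, m in voxels)
    return absorbed / total_mass

# Two hypothetical tissue voxels: muscle-like and fat-like.
voxels = [(10.0, 0.9, 1040.0, 0.002), (10.0, 0.05, 920.0, 0.001)]
print(round(whole_body_sar(voxels), 5))   # 0.02975 W/kg
```

    The paper's point is that the result depends not only on the fields but on exactly this step: how the power loss is evaluated and how material voxels are positioned relative to the FDTD grid.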

  17. Domain decomposition for the computation of radiosity in lighting simulation; Decomposition de domaines pour le calcul de la radiosite en simulation d'eclairage

    Energy Technology Data Exchange (ETDEWEB)

    Salque, B

    1998-07-01

    This work deals with the equation of radiosity, which describes the transport of light energy through a diffuse medium; its resolution enables us to simulate the presence of light sources. The equation of radiosity is an integral equation that admits a unique solution in realistic cases. The different solution methods are reviewed. The equation of radiosity cannot be formulated as the integral form of a classical partial differential equation, but this work shows that the technique of domain decomposition can be successfully applied to the equation of radiosity if this approach is framed by considerations of physics. This method provides a system of independent equations valid for each sub-domain and whose main parameter is luminance. Some numerical examples give an idea of the convergence of the algorithm. This method is applied to the optimization of the shape of a light reflector.

  18. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier.

    Science.gov (United States)

    Luo, An; Sullivan, Thomas J

    2010-04-01

    We introduce a user-friendly steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) system. Single-channel EEG is recorded using a low-noise dry electrode. Compared to traditional gel-based multi-sensor EEG systems, a dry sensor proves to be more convenient, comfortable and cost effective. A hardware system was built that displays four LED light panels flashing at different frequencies and synchronizes with EEG acquisition. The visual stimuli have been carefully designed such that potential risk to photosensitive people is minimized. We describe a novel stimulus-locked inter-trace correlation (SLIC) method for SSVEP classification using EEG time-locked to stimulus onsets. We studied how the performance of the algorithm is affected by different selection of parameters. Using the SLIC method, the average light detection rate is 75.8% with very low error rates (an 8.4% false positive rate and a 1.3% misclassification rate). Compared to a traditional frequency-domain-based method, the SLIC method is more robust (resulting in less annoyance to the users) and is also suitable for irregular stimulus patterns.
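    The principle behind stimulus-locked inter-trace correlation, that EEG epochs time-locked to the flicker onsets correlate strongly with each other when an SSVEP is present, can be sketched as follows (an illustration of the idea only, not the authors' SLIC implementation):

```python
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def inter_trace_correlation(epochs):
    """Mean pairwise correlation of EEG traces time-locked to stimulus onsets."""
    pairs = [(i, j) for i in range(len(epochs)) for j in range(i + 1, len(epochs))]
    return sum(pearson(epochs[i], epochs[j]) for i, j in pairs) / len(pairs)

# A perfectly phase-locked response repeats identically in every epoch.
locked = [[math.sin(2 * math.pi * k / 8) for k in range(8)] for _ in range(4)]
print(round(inter_trace_correlation(locked), 3))  # 1.0
```

    Real epochs contain noise, so the statistic falls below 1; thresholding it per flicker frequency is one way to decide which light the user is attending to.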

  19. Exploring the Deep-Level Reasoning Questions Effect during Vicarious Learning among Eighth to Eleventh Graders in the Domains of Computer Literacy and Newtonian Physics

    Science.gov (United States)

    Gholson, Barry; Witherspoon, Amy; Morgan, Brent; Brittingham, Joshua K.; Coles, Robert; Graesser, Arthur C.; Sullins, Jeremiah; Craig, Scotty D.

    2009-01-01

    This paper tested the deep-level reasoning questions effect in the domains of computer literacy between eighth and tenth graders and Newtonian physics for ninth and eleventh graders. This effect claims that learning is facilitated when the materials are organized around questions that invite deep reasoning. The literature indicates that vicarious…

  20. Leveraging Cloud Computing to Address Public Health Disparities: An Analysis of the SPHPS.

    Science.gov (United States)

    Jalali, Arash; Olabode, Olusegun A; Bell, Christopher M

    2012-01-01

    As the use of certified electronic health record technology (CEHRT) has continued to gain prominence in hospitals and physician practices, public health agencies and health professionals have the ability to access health data through health information exchanges (HIE). With such knowledge health providers are well positioned to positively affect population health, and enhance health status or quality-of-life outcomes in at-risk populations. Through big data analytics, predictive analytics and cloud computing, public health agencies have the opportunity to observe emerging public health threats in real-time and provide more effective interventions addressing health disparities in our communities. The Smarter Public Health Prevention System (SPHPS) provides real-time reporting of potential public health threats to public health leaders through the use of a simple and efficient dashboard and links people with needed personal health services through mobile platforms for smartphones and tablets to promote and encourage healthy behaviors in our communities. The purpose of this working paper is to evaluate how a secure virtual private cloud (VPC) solution could facilitate the implementation of the SPHPS in order to address public health disparities.

  1. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    Computational science is now indispensable to the solution of complex problems in every sector, from traditional science and engineering domains to such key areas as national security, public health...

  2. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  3. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  4. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .

  5. 6th International Workshop on Computer-Aided Scheduling of Public Transport

    CERN Document Server

    Branco, Isabel; Paixão, José

    1995-01-01

    This proceedings volume consists of papers presented at the Sixth International Workshop on Computer-Aided Scheduling of Public Transport, which was held at the Fundação Calouste Gulbenkian in Lisbon from July 6th to 9th, 1993. In the tradition of alternating Workshops between North America and Europe - Chicago (1975), Leeds (1980), Montreal (1983), Hamburg (1987) and again Montreal (1990), the European city of Lisbon was selected as the venue for the Workshop in 1993. As in earlier Workshops, the central theme dealt with vehicle and duty scheduling problems and the employment of operations-research-based software systems for operational planning in public transport. However, as was initiated in Hamburg in 1987, the scope of this Workshop was broadened to include topics in related fields. This fundamental alteration was an inevitable consequence of the growing demand over the last decade for solutions to the complete planning process in public transport through integrated systems. Therefore, the program of thi...

  6. 76 FR 12397 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...

  7. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    Energy Technology Data Exchange (ETDEWEB)

    Clement, F.; Vodicka, A.; Weis, P. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Martin, V. [Institut National de Recherches Agronomiques (INRA), 92 - Chetenay Malabry (France); Di Cosmo, R. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Paris-7 Univ., 75 (France)

    2003-07-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids based on Robin interface conditions to the problem of flow surrounding an underground nuclear waste disposal. We show with a simple example how one can refine the mesh locally around the storage with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be achieved in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment, which is designed to ease parallel programming. In this way we test the code coupling and, at the same time, exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)

  8. Domain decomposition with local refinement for flow simulation around a nuclear waste disposal site: direct computation versus simulation using code coupling with OCamlP3L

    International Nuclear Information System (INIS)

    Clement, F.; Vodicka, A.; Weis, P.; Martin, V.; Di Cosmo, R.

    2003-01-01

    We consider the application of a non-overlapping domain decomposition method with non-matching grids based on Robin interface conditions to the problem of flow surrounding an underground nuclear waste disposal. We show with a simple example how one can refine the mesh locally around the storage with this technique. A second aspect is studied in this paper. The coupling between the sub-domains can be achieved in two ways: either directly (i.e. the domain decomposition algorithm is included in the code that solves the problems on the sub-domains) or using code coupling. In the latter case, each sub-domain problem is solved separately and the coupling is performed by another program. We wrote a coupling program in the functional language OCaml, using the OCamlP3L environment, which is designed to ease parallel programming. In this way we test the code coupling and, at the same time, exploit the natural parallelism of domain decomposition methods. Some simple 2D numerical tests show promising results, and further studies are under way. (authors)
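
The Robin-coupled, non-overlapping scheme described in this abstract can be illustrated on a toy 1D problem. The sketch below is an assumption-laden illustration, not the authors' code: it solves -u'' = 1 on [0, 1] split at x = 0.5 into two finite-difference subdomains, exchanging only the scalar Robin data u' + p·u across the interface until the subdomain solutions agree.

```python
import numpy as np

def solve_left(M, h, p, g, f=1.0):
    """-u'' = f on [0, 0.5]: u(0) = 0, Robin u'(0.5) + p*u(0.5) = g.
    Unknowns are nodal values u_1..u_M; one-sided O(h) difference at x = 0.5."""
    A = np.zeros((M, M))
    b = np.full(M, f * h**2)
    for r in range(M - 1):               # interior nodes 1..M-1
        A[r, r] = 2.0
        if r > 0:
            A[r, r - 1] = -1.0
        A[r, r + 1] = -1.0
    A[M - 1, M - 1] = 1.0 / h + p        # Robin row at the interface node
    A[M - 1, M - 2] = -1.0 / h
    b[M - 1] = g
    return np.linalg.solve(A, b)

def solve_right(M, h, p, g, f=1.0):
    """-u'' = f on [0.5, 1]: Robin -u'(0.5) + p*u(0.5) = g, u(1) = 0."""
    A = np.zeros((M, M))
    b = np.full(M, f * h**2)
    A[0, 0] = 1.0 / h + p                # Robin row at the interface node
    A[0, 1] = -1.0 / h
    b[0] = g
    for r in range(1, M):                # interior nodes
        A[r, r] = 2.0
        A[r, r - 1] = -1.0
        if r + 1 < M:
            A[r, r + 1] = -1.0
    return np.linalg.solve(A, b)

# Lions-style iteration: each subdomain is solved independently, and only the
# scalar Robin data crosses the interface (in the paper, either inside one code
# or between separate codes tied together by a coupling program).
M, p = 50, 2.0
h = 0.5 / M
g1 = g2 = 0.0
for _ in range(50):
    u1 = solve_left(M, h, p, g1)         # nodes 1..M of [0, 0.5]
    u2 = solve_right(M, h, p, g2)        # nodes 0..M-1 of [0.5, 1]
    g1 = (u2[1] - u2[0]) / h + p * u2[0]         # Robin data for the left solve
    g2 = -(u1[-1] - u1[-2]) / h + p * u1[-1]     # Robin data for the right solve
```

At the fixed point the two one-sided interface fluxes and the two interface values coincide, so the subdomain solutions match at x = 0.5; for this symmetric 1D case the choice p = 2 makes the iteration converge in a couple of sweeps.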

  9. THE USE OF COMPUTER APPLICATIONS IN THE STUDY OF ROMANIA'S PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Popeanga Vasile

    2011-07-01

    Full Text Available Total public debt represents all monetary obligations of the state (government, public institutions, financial and administrative-territorial units) at a given time, resulting from internal and external loans (in lei and in foreign currencies) contracted on short, medium and long term, together with the state treasury's own obligations for the amounts advanced temporarily to cover the budget deficit. Loans may be contracted by the state through the Ministry of Finance, in its own name or guaranteed by it. Public debt is expressed in local or foreign currency, depending on where the loans are contracted and on their conditions. In order to evaluate Romania's public debt, obligations denominated in currencies other than the national currency are converted using the exchange rate of the National Bank of Romania. Also, the total public debt of a country can be expressed in absolute values (to know the load on that country's economy towards its creditors), in relative values as a percentage of GDP (to allow comparison over time and between countries) and as the average size per capita (to allow comparisons and analysis in time and space). Total public debt is calculated and managed separately in its two forms, namely domestic public debt and external public debt. The Ministry of Finance prepares and submits annually, to the Government for approval and to Parliament for information, a report on public debt, which contains information on the government debt portfolio, debt service, public indebtedness indicators and information about the primary and secondary markets for state securities and about how the medium-term government debt management strategy was implemented in the previous year. In order to make quick and effective comparisons on public debt dynamics in Romania, Excel 2010 offers new features such as sparklines and slicers, which can help discover trends and statistics in the existing data.
The aim of this article is an accurate assessment of Romania's public debt and its

  10. Sources and Resources Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health.

    Science.gov (United States)

    Gorsky, Martin

    2015-08-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996-2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the 'declinist' historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour.

  11. Combining Public Domain and Professional Panoramic Imagery for the Accurate and Dense 3d Reconstruction of the Destroyed Bel Temple in Palmyra

    Science.gov (United States)

    Wahbeh, W.; Nebiker, S.; Fangi, G.

    2016-06-01

    This paper exploits the potential of dense multi-image 3d reconstruction of destroyed cultural heritage monuments by either using public domain touristic imagery only or by combining the public domain imagery with professional panoramic imagery. The focus of our work is placed on the reconstruction of the temple of Bel, one of the Syrian heritage monuments, which was destroyed in September 2015 by the so-called "Islamic State". The great temple of Bel is considered one of the most important religious buildings of the 1st century AD in the East, with a unique design. The investigations and the reconstruction were carried out using two types of imagery. The first are freely available generic touristic photos collected from the web. The second are panoramic images captured in 2010 for documenting those monuments. In the paper we present a 3d reconstruction workflow for both types of imagery using state-of-the-art dense image matching software, addressing the non-trivial challenges of combining uncalibrated public domain imagery with panoramic images with very wide baselines. We subsequently investigate the aspects of accuracy and completeness obtainable from the public domain touristic images alone and from the combination with spherical panoramas. We furthermore discuss the challenges of co-registering the weakly connected 3d point cloud fragments resulting from the limited coverage of the touristic photos. We then describe an approach using spherical photogrammetry as a virtual topographic survey, allowing the co-registration of a detailed and accurate single 3d model of the temple interior and exterior.

  12. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas

    International Nuclear Information System (INIS)

    2001-01-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general direction of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - the public utility in the domain of energy: definition of the public utility missions, experience feedback about liberalized markets, public utility obligation and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supplies, opinion of operators, security of power supplies versus liberalization and investments; 4 - security of gas supplies: markets liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and of security of supplies in a competing market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  13. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of architectures and studying performance measurement of a parallel computing environment using a Monte Carlo simulation for particle therapy using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
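
Monte Carlo workloads like the one above are embarrassingly parallel: independent sampling streams can be farmed out to workers and their tallies merged. The sketch below is a generic illustration (estimating pi, not a particle-therapy dose engine), with all names and parameters invented; a thread pool keeps it portable, and a real deployment would swap in process workers or cloud HPC instances for CPU-bound speedups, since the GIL serialises pure-Python threads.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def count_hits(args):
    """Count random points that fall inside the unit quarter-circle."""
    seed, n = args
    rng = random.Random(seed)        # independent, reproducible stream per task
    hits = 0
    for _ in range(n):
        a, b = rng.random(), rng.random()
        if a * a + b * b < 1.0:
            hits += 1
    return hits

def parallel_pi(total=400_000, workers=4):
    """Split the sampling across a pool of workers and merge the tallies."""
    per = total // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        tallies = pool.map(count_hits, [(seed, per) for seed in range(workers)])
    return 4.0 * sum(tallies) / (per * workers)

estimate = parallel_pi()
```

Because the streams never interact, the speedup from adding workers is limited mainly by scheduling overhead, which is the property the cloud HPC study exploits.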

  14. Computational analysis and prediction of the binding motif and protein interacting partners of the Abl SH3 domain.

    Directory of Open Access Journals (Sweden)

    Tingjun Hou

    2006-01-01

    Full Text Available Protein-protein interactions, particularly weak and transient ones, are often mediated by peptide recognition domains, such as Src Homology 2 and 3 (SH2 and SH3) domains, which bind to specific sequence and structural motifs. It is important but challenging to determine the binding specificity of these domains accurately and to predict their physiological interacting partners. In this study, the interactions between 35 peptide ligands (15 binders and 20 non-binders) and the Abl SH3 domain were analyzed using molecular dynamics simulation and the Molecular Mechanics/Poisson-Boltzmann Surface Area method. The calculated binding free energies correlated well with the rank order of the binding peptides and clearly distinguished binders from non-binders. Free energy component analysis revealed that the van der Waals interactions dictate the binding strength of peptides, whereas the binding specificity is determined by the electrostatic interaction and the polar contribution of desolvation. The binding motif of the Abl SH3 domain was then determined by a virtual mutagenesis method, which mutates the residue at each position of the template peptide to each of the other 19 amino acids and calculates the binding free energy difference between the template and the mutated peptides using the Molecular Mechanics/Poisson-Boltzmann Surface Area method. A single-position mutation free energy profile was thus established and used as a scoring matrix to search for peptides recognized by the Abl SH3 domain in the human genome. Our approach successfully picked ten out of 13 experimentally determined binding partners of the Abl SH3 domain among the top 600 candidates from the 218,540 decapeptides with the PXXP motif in the SWISS-PROT database. We expect that this physical-principle based method can be applied to other protein domains as well.
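
The final step of the pipeline above, scanning a proteome with a position-specific scoring matrix restricted to PXXP-containing decapeptides, can be sketched compactly. Everything below is illustrative: the template sequence is hypothetical and the 0/1 penalty matrix is a toy stand-in for the computed mutation free-energy profile.

```python
import re

# Hypothetical 10-residue template ligand; the toy scoring matrix charges 0 for
# the template residue and 1 for any substitution, standing in for the paper's
# per-position ddG values from virtual mutagenesis.
TEMPLATE = "APSYSPPPPP"

def score(peptide):
    """Lower is better: sum of per-position substitution penalties."""
    return sum(0.0 if aa == t else 1.0 for aa, t in zip(peptide, TEMPLATE))

def scan(sequence, k=10):
    """Score every k-mer that carries the PxxP motif required of SH3 ligands."""
    hits = []
    for i in range(len(sequence) - k + 1):
        window = sequence[i:i + k]
        if re.search(r"P..P", window):
            hits.append((score(window), i, window))
    return sorted(hits)             # best (lowest-penalty) candidates first

protein = "MKTAYIA" + TEMPLATE + "GGQRSLV"   # toy sequence with the ligand embedded
best_score, best_pos, best_window = scan(protein)[0]
```

Ranking the whole SWISS-PROT decapeptide set, as in the paper, is the same loop applied per protein with the real free-energy matrix in place of the toy penalties.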

  15. SURF: a subroutine code to draw the axonometric projection of a surface generated by a scalar function over a discretized plane domain using finite element computations

    International Nuclear Information System (INIS)

    Giuliani, Giovanni; Giuliani, Silvano.

    1980-01-01

    The FORTRAN IV subroutine SURF has been designed to help in visualising the results of Finite Element computations. It draws the axonometric projection of a surface generated in 3-dimensional space by a scalar function over a discretized plane domain. The most important characteristic of the routine is that it removes hidden lines, enabling a clear view of the details of the generated surface.
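
Hidden-line removal for a single-valued surface of this kind is classically done with a floating-horizon scan. The sketch below is a simplified illustration (Python rather than FORTRAN IV, an invented projection, and only an upper horizon): rows are marched front to back, and a grid point is drawn only if its projected height rises above the highest height recorded so far in its screen column.

```python
import numpy as np

# Surface z = f(x, y) sampled on a grid (a stand-in for a finite-element field)
n = 60
x = np.linspace(-3.0, 3.0, n)
X, Y = np.meshgrid(x, x)
Z = np.exp(-(X**2 + Y**2))            # a single Gaussian bump

# Simple axonometric projection to screen coordinates (u, v)
U = X + 0.5 * Y
V = Z + 0.3 * Y

# Floating-horizon scan over discretized screen columns
cols = np.round((U - U.min()) / (U.max() - U.min()) * (2 * n - 1)).astype(int)
horizon = np.full(2 * n, -np.inf)     # highest height seen so far per column
visible = np.zeros_like(Z, dtype=bool)
for row in range(n):                  # row 0 (smallest y) is nearest the viewer
    for j in range(n):
        c = cols[row, j]
        if V[row, j] > horizon[c]:    # point pokes above the current horizon
            visible[row, j] = True
            horizon[c] = V[row, j]

hidden_fraction = 1.0 - visible.mean()
```

A full implementation also keeps a lower horizon so that surface portions seen from below are drawn; the mask computed here is what decides which polyline segments the plotter actually emits.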

  16. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    Full Text Available This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund, the article discusses the experience of digitally scripted spaces and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  17. High-performance parallel computing in the classroom using the public goods game as an example

    Science.gov (United States)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
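
Before porting such a model to a graphics card, it is common to prototype the dynamics serially. The sketch below is an illustrative plain-Python (not CUDA) implementation of the lattice public goods game with Fermi-rule imitation; the lattice size, synergy factor and noise parameter are arbitrary choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
L, r, K = 20, 4.0, 0.5                   # lattice size, synergy factor, noise
grid = rng.integers(0, 2, size=(L, L))   # 1 = cooperator, 0 = defector

def payoff(s, x, y):
    """Payoff of player (x, y) accumulated over the five groups it belongs to."""
    total = 0.0
    for cx, cy in [(x, y), (x-1, y), (x+1, y), (x, y-1), (x, y+1)]:
        cx, cy = cx % L, cy % L
        # group centred on (cx, cy): the centre plus its von Neumann neighbours
        members = [(cx, cy), (cx-1, cy), (cx+1, cy), (cx, cy-1), (cx, cy+1)]
        nc = sum(s[i % L, j % L] for i, j in members)
        total += r * nc / 5.0 - s[x, y]  # share of the pot minus own contribution
    return total

def mc_step(s):
    """One Monte Carlo step: L*L elementary strategy-imitation attempts."""
    for _ in range(L * L):
        x, y = rng.integers(0, L, 2)
        dx, dy = [(-1, 0), (1, 0), (0, -1), (0, 1)][rng.integers(0, 4)]
        nx, ny = (x + dx) % L, (y + dy) % L
        if s[x, y] != s[nx, ny]:
            # Fermi rule: imitate the neighbour with a payoff-dependent probability
            p = 1.0 / (1.0 + np.exp((payoff(s, x, y) - payoff(s, nx, ny)) / K))
            if rng.random() < p:
                s[x, y] = s[nx, ny]

for _ in range(5):
    mc_step(grid)
rho = grid.mean()                        # cooperator density
```

On a GPU the same payoff and update rules are evaluated for many sites concurrently, which is where the orders-of-magnitude speedup reported in the paper comes from.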

  18. A reaction time advantage for calculating beliefs over public representations signals domain specificity for 'theory of mind'.

    Science.gov (United States)

    Cohen, Adam S; German, Tamsin C

    2010-06-01

    In a task where participants' overt task was to track the location of an object across a sequence of events, reaction times to unpredictable probes requiring an inference about a social agent's beliefs about the location of that object were obtained. Reaction times to false belief situations were faster than responses about the (false) contents of a map showing the location of the object (Experiment 1) and about the (false) direction of an arrow signaling the location of the object (Experiment 2). These results are consistent with developmental, neuro-imaging and neuropsychological evidence that there exist domain specific mechanisms within human cognition for encoding and reasoning about mental states. Specialization of these mechanisms may arise from either core cognitive architecture or via the accumulation of expertise in the social domain.

  19. Study on k-shortest paths with behavioral impedance domain from the intermodal public transportation system perspective

    OpenAIRE

    Pereira, Hernane Borges de Barros; Pérez Vidal, Lluís; Lozada, Eleazar G. Madrid

    2003-01-01

    Behavioral impedance domain consists of a theory on route planning for pedestrians, within which constraint management is considered. The goal of this paper is to present the k-shortest path model using the behavioral impedance approach. After building the mathematical model, stating the optimization problem and solving it with a behavioral impedance algorithm, it is discussed how the behavioral impedance cost function is embedded in the k-shortest path model. From the pedestrian's route planning persp...
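
The k-shortest-path machinery at the core of such a model can be sketched with a best-first search over partial paths; in the behavioral-impedance setting the edge weight would be travel time plus the pedestrian's impedance penalty for that link. The graph and weights below are invented for illustration.

```python
import heapq

def k_shortest_paths(graph, s, t, k):
    """Return the k cheapest s-t paths by best-first search over partial paths.

    graph: dict mapping node -> list of (neighbour, weight) with weight > 0.
    """
    heap = [(0.0, s, (s,))]            # (cost so far, current node, path)
    found, visits = [], {}
    while heap and len(found) < k:
        cost, node, path = heapq.heappop(heap)
        visits[node] = visits.get(node, 0) + 1
        if visits[node] > k:           # a node never needs expanding more than k times
            continue
        if node == t:
            found.append((cost, path))
            continue
        for nbr, w in graph.get(node, []):
            heapq.heappush(heap, (cost + w, nbr, path + (nbr,)))
    return found

graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 1), ("D", 5)],
    "C": [("D", 1)],
}
routes = k_shortest_paths(graph, "A", "D", 3)
```

This enumerates alternatives in increasing cost order, which is what lets a route planner offer several behaviourally distinct options rather than only the single optimum.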

  20. Domains and domain loss

    DEFF Research Database (Denmark)

    Haberland, Hartmut

    2005-01-01

    politicians and in the media, especially in the discussion whether some languages undergo ‘domain loss’ vis-à-vis powerful international languages like English. An objection that has been raised here is that domains, as originally conceived, are parameters of language choice and not properties of languages...

  1. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...

  2. RISE OF BIOINFORMATICS AND COMPUTATIONAL BIOLOGY IN INDIA: A LOOK THROUGH PUBLICATIONS

    Directory of Open Access Journals (Sweden)

    Anjali Srivastava

    2017-09-01

    Full Text Available Computational biology and bioinformatics have been part and parcel of biomedical research for a few decades now. However, the institutionalization of bioinformatics research took place with the establishment of Distributed Information Centres (DISCs) in research institutions of repute across various disciplines by the Department of Biotechnology, Government of India. Though, at the initial stages, this endeavor was mainly focused on providing infrastructure for using information technology and internet-based communication and tools for carrying out computational biology and in-silico assisted research in varied arenas, from disease biology to agricultural crops, spices, veterinary science and many more, the natural outcome of the establishment of such facilities was new experiments with bioinformatics tools. Thus, Biotechnology Information Systems (BTIS) grew into a solid movement and a large number of publications started coming out of these centres. At the end of the last century, bioinformatics started developing into a full-fledged research subject. In the last decade, a need was felt to make a factual estimation of the results of this endeavor of the DBT, which had, by then, established about two hundred centres in almost all disciplines of biomedical research. In a bid to evaluate the efforts and outcomes of these centres, the BTIS Centre at CSIR-CDRI, Lucknow was entrusted with collecting and collating the publications of these centres. However, when the full data was compiled, the DBT task force felt that the study must also include non-BTIS centres, so as to expand the report to give a glimpse of bioinformatics publications from the country.

  3. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE USING RECORDS AND DONATED... for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  4. To every manifest domain a CSP expression – a rôle for mereology in computer science

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2018-01-01

    We give an abstract model of parts and part-hood relations, of Stanisław Lesniewski’s mereology. Mereology applies to software application domains such as the financial service industry, railway systems, road transport systems, health care, oil pipelines, secure [IT] systems, etc. We relate...

  5. Open Access and the Public Domain in Digital Data and Information for Science: Proceedings of an International Symposium

    National Research Council Canada - National Science Library

    Esanu, Julie

    2004-01-01

    .... On the one hand, the Internet provides valuable new opportunities for overcoming geographic limitations and the promise of unprecedented open access to public information for research on a global basis...

  6. The Computer Revolution and Physical Chemistry.

    Science.gov (United States)

    O'Brien, James F.

    1989-01-01

    Describes laboratory-oriented software programs that are short, time-saving, eliminate computational errors, and not found in public domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)

  7. Computer simulations of frequency-dependent dielectric response of 90-degree domain walls in tetragonal barium titanate

    Czech Academy of Sciences Publication Activity Database

    Márton, Pavel; Hlinka, Jiří

    2008-01-01

    Roč. 373, č. 1 (2008), s. 139-144 ISSN 0015-0193 R&D Projects: GA ČR GA202/06/0411 Institutional research plan: CEZ:AV0Z10100520 Keywords: ferroelectric and ferroelastic domains * BaTiO3 * Ginzburg-Landau theory * mobility Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 0.562, year: 2008

  8. It's a Wonderful Life: Using Public Domain Cinema Clips To Teach Affective Objectives and Illustrate Real-World Algebra Applications.

    Science.gov (United States)

    Palmer, Loretta

    A basic algebra unit was developed at Utah Valley State College to emphasize applications of mathematical concepts in the work world, using video and computer-generated graphics to integrate textual material. The course was implemented in three introductory algebra sections involving 80 students and taught algebraic concepts using such areas as…

  9. Computation of Scattering from Bodies of Revolution Using an Entire-Domain Basis Implementation of the Moment Method

    National Research Council Canada - National Science Library

    Ford, Arthur

    1999-01-01

    Research into improved calibration targets for measurement of radar cross-section has created a need for the ability to accurately compute the scattering from perfectly conducting bodies of revolution...

  10. The insurance industry and public-private collaborations as a vector to develop and spread EO technologies and techniques in the domain of Food Security: The Swiss Re case.

    Science.gov (United States)

    Coutu, S.; Ragaz, M.; Mäder, D.; Hammer, P.; Andriesse, M.; Güttinger, U.; Feyen, H.

    2017-12-01

    The insurance industry has been contributing to the resilient development of agriculture in multiple regions of the globe since the beginning of the 19th century. It has also, from the very beginning of the development of EO sciences, kept a very close eye on the development of technologies and techniques in this domain. Recent advances in this area, such as increased satellite imagery resolution, faster computation times and Big Data management, combined with the ground-based knowledge of the insurance industry, have offered farmers not only tools permitting better crop management, but also reliable and live yield coverage. This study presents several of these applications at different scales (industrial farming and micro-farming) and in different climate regions, with an emphasis on the limits of current products. Some of these limits, such as lack of access to ground data, R&D efforts or understanding of needs on the ground, could be quickly overcome through closer public-private or private-private collaborations. However, despite a clear benefit for the Food Security nexus and potential win-win situations, those collaborations are not always simple to develop. We present both successful and disappointing collaboration cases based on the experience of Swiss Re as a global insurance leader. As a conclusion, we highlight how academia, NGOs, governmental organizations, start-ups and the insurance industry can come together to foster the development of EO in the domain of Food Security and bring cutting-edge science to game-changing industrial applications.

  11. ICRP Publication 116—the first ICRP/ICRU application of the male and female adult reference computational phantoms

    CERN Document Server

    Petoussi-Henss, Nina; Eckerman, Keith F; Endo, Akira; Hertel, Nolan; Hunt, John; Menzel, Hans G; Pelliccioni, Maurizio; Schlattl, Helmut; Zankl, Maria

    2014-01-01

    ICRP Publication 116 on `Conversion coefficients for radiological protection quantities for external radiation exposures', provides fluence-to-dose conversion coefficients for organ-absorbed doses and effective dose for various types of external exposures (ICRP 2010 ICRP Publication 116). The publication supersedes the ICRP Publication 74 (ICRP 1996 ICRP Publication 74, ICRU 1998 ICRU Report 57), including new particle types and expanding the energy ranges considered. The coefficients were calculated using the ICRP/ICRU computational phantoms (ICRP 2009 ICRP Publication 110) representing the reference adult male and reference adult female (ICRP 2002 ICRP Publication 89), together with a variety of Monte Carlo codes simulating the radiation transport in the body. Idealized whole-body irradiation from unidirectional and rotational parallel beams as well as isotropic irradiation was considered for a large variety of incident radiations and energy ranges. Comparison of the effective doses with operational quantit...
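
Applying tabulated conversion coefficients in practice usually means interpolating between the published energy grid points, and log-log interpolation is the common convention for these smooth curves spanning several decades in energy. The table below is invented for illustration; the real values are in the ICRP 116 tables.

```python
import numpy as np

# Invented energy grid (MeV) and fluence-to-effective-dose conversion
# coefficients (pSv cm^2) -- placeholders, NOT values from ICRP Publication 116.
energies = np.array([0.1, 1.0, 10.0, 100.0])
coefficients = np.array([0.8, 4.0, 40.0, 300.0])

def conversion_coefficient(energy_mev):
    """Log-log interpolation between tabulated points: interpolate linearly in
    (log E, log h) space, then transform back."""
    return float(np.exp(np.interp(np.log(energy_mev),
                                  np.log(energies), np.log(coefficients))))
```

At a tabulated energy the interpolation reproduces the table entry, and between entries it follows a power-law segment rather than a straight line in linear space, which better matches how such coefficients vary with energy.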

  12. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    Full Text Available Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  13. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    Science.gov (United States)

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  14. Bringing numerous methods for expression and promoter analysis to a public cloud computing service.

    Science.gov (United States)

    Polanski, Krzysztof; Gao, Bo; Mason, Sam A; Brown, Paul; Ott, Sascha; Denby, Katherine J; Wild, David L

    2018-03-01

    Every year, a large number of novel algorithms are introduced to the scientific community for a myriad of applications, but using these across different research groups is often troublesome, due to suboptimal implementations and specific dependency requirements. This does not have to be the case, as public cloud computing services can easily house tractable implementations within self-contained dependency environments, making the methods easily accessible to a wider public. We have taken 14 popular methods, the majority related to expression data or promoter analysis, developed these up to a good implementation standard and housed the tools in isolated Docker containers which we integrated into the CyVerse Discovery Environment, making these easily usable for a wide community as part of the CyVerse UK project. The integrated apps can be found at http://www.cyverse.org/discovery-environment, while the raw code is available at https://github.com/cyversewarwick and the corresponding Docker images are housed at https://hub.docker.com/r/cyversewarwick/. info@cyverse.warwick.ac.uk or D.L.Wild@warwick.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  15. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.

  16. Associations between neck musculoskeletal complaints and work related factors among public service computer workers in Kaunas

    Directory of Open Access Journals (Sweden)

    Gintaré Kaliniene

    2013-10-01

    Full Text Available Objectives: Information technologies have been developing very rapidly, including in occupational settings. Epidemiological studies have shown that employees who work with computers are more likely to complain of musculoskeletal disorders (MSD). The aim of this study was to evaluate associations between neck MSD and individual and work-related factors. Materials and Methods: The investigation, which consisted of two parts, a questionnaire study (using the Nordic Musculoskeletal Questionnaire and the Copenhagen Psychosocial Questionnaire) and a direct observation (to evaluate the ergonomic work environment using the RULA method), was carried out in three randomly selected public sector companies of Kaunas. The study population consisted of 513 public service office workers. Results: The survey showed that neck MSDs were very common in the investigated population. The prevalence rate amounted to 65.7%. According to our survey, neck MSDs were significantly associated with older age, greater work experience, high quantitative and cognitive job demands, working for longer than 2 h without taking a break, as well as with a higher ergonomic risk score. In the fully adjusted model, working for longer than 2 h without taking a break had the strongest association with neck complaints. Conclusion: It was confirmed that neck MSDs were significantly associated with individual factors as well as conditions of work; therefore, preventive actions against neck complaints should be oriented at the psychosocial and ergonomic work environment as well as at individual factors.
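The basic epidemiological quantities behind such a survey can be written out directly. Only the sample size (513) and the prevalence (65.7%) come from the study; the 2x2 exposure counts below are invented purely to illustrate the odds-ratio arithmetic.

```python
# Prevalence from counts, and an odds ratio from a 2x2 exposure table.

def prevalence(cases, total):
    return cases / total

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d) for a 2x2 table; OR > 1 indicates a positive association."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

n_total = 513
n_cases = round(0.657 * n_total)   # ~337 workers reporting neck MSD

# Hypothetical split by "works >2 h without a break" (counts invented):
or_long_spells = odds_ratio(220, 80, 117, 96)
```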

  17. Computational Studies of the Active and Inactive Regulatory Domains of Response Regulator PhoP Using Molecular Dynamics Simulations.

    Science.gov (United States)

    Qing, Xiao-Yu; Steenackers, Hans; Venken, Tom; De Maeyer, Marc; Voet, Arnout

    2017-11-01

    The response regulator PhoP is part of the PhoP/PhoQ two-component system, which is responsible for regulating the expression of multiple genes involved in controlling virulence, biofilm formation, and resistance to antimicrobial peptides. Therefore, modulating the transcriptional function of the PhoP protein is a promising strategy for developing new antimicrobial agents. There is evidence suggesting that phosphorylation-mediated dimerization in the regulatory domain of PhoP is essential for its transcriptional function. Disruption or stabilization of protein-protein interactions at the dimerization interface may inhibit or enhance the expression of PhoP-dependent genes. In this study, we performed molecular dynamics simulations on the active and inactive dimers and monomers of the PhoP regulatory domains, followed by pocket-detecting screenings and a quantitative hot-spot analysis in order to assess the druggability of the protein. Consistent with prior hypothesis, the calculation of the binding free energy shows that phosphorylation enhances dimerization of PhoP. Furthermore, we have identified two different putative binding sites at the dimerization active site (the α4-β5-α5 face) with energetic "hot-spot" areas, which could be used to search for modulators of protein-protein interactions. This study delivers insight into the dynamics and druggability of the dimerization interface of the PhoP regulatory domain, and may serve as a basis for the rational identification of new antimicrobial drugs. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
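The MM-GBSA-style bookkeeping behind the abstract's conclusions can be shown with a toy calculation. All energy values below are invented; the point is only the structure of the computation: the binding free energy is the complex energy minus the separated monomers, and "hot spots" are residues whose per-residue contribution passes a chosen cutoff.

```python
# Toy MM-GBSA-style bookkeeping; energies in kcal/mol, all values invented.

def binding_free_energy(g_complex, g_monomer_a, g_monomer_b):
    """dG_bind = G(dimer) - G(monomer A) - G(monomer B); more negative = tighter."""
    return g_complex - g_monomer_a - g_monomer_b

def hot_spots(per_residue_dg, cutoff=-1.5):
    """Residues contributing at least |cutoff| kcal/mol to binding."""
    return [res for res, dg in per_residue_dg.items() if dg <= cutoff]

# Phosphorylation making the dimer more favorable, as the study reports:
dg_inactive = binding_free_energy(-120.0, -58.0, -57.0)   # -> -5.0
dg_active = binding_free_energy(-131.0, -59.0, -58.0)     # -> -14.0

# Hypothetical per-residue decomposition at the alpha4-beta5-alpha5 face:
contributions = {"ARG328": -3.2, "HIS511": -2.1, "GLY330": -0.4}
```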

  18. A semantic-based method for extracting concept definitions from scientific publications: evaluation in the autism phenotype domain

    OpenAIRE

    Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K

    2013-01-01

    Background A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based t...

  19. Structural models of zebrafish (Danio rerio) NOD1 and NOD2 NACHT domains suggest differential ATP binding orientations: insights from computational modeling, docking and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Jitendra Maharana

    Full Text Available Nucleotide-binding oligomerization domain-containing protein 1 (NOD1) and NOD2 are cytosolic pattern recognition receptors playing pivotal roles in innate immune signaling. NOD1 and NOD2 recognize the bacterial peptidoglycan derivatives iE-DAP and MDP, respectively, and undergo conformational alteration and ATP-dependent self-oligomerization of the NACHT domain followed by downstream signaling. The lack of structural detail for the NACHT domain limits our understanding of the NOD-mediated signaling mechanism. Here, we predicted the structure of the NACHT domain of both NOD1 and NOD2 from the model organism zebrafish (Danio rerio) using computational methods. Our study highlighted the differential ATP binding modes in NOD1 and NOD2. In NOD1, the γ-phosphate of ATP faced toward the central nucleotide binding cavity, as in NLRC4, whereas in NOD2 the cavity was occupied by the adenine moiety. The conserved lysine at Walker A formed hydrogen bonds (H-bonds) and the aspartic acid at Walker B formed an electrostatic interaction with ATP. At Sensor 1, Arg328 of NOD1 exhibited an H-bond with ATP, whereas the corresponding Arg404 of NOD2 did not. The proline of the GxP motif (Pro386 of NOD1 and Pro464 of NOD2) interacted with the adenine moiety, and His511 at Sensor 2 of NOD1 interacted with the γ-phosphate group of ATP. In contrast, His579 of NOD2 interacted with the adenine moiety in a relatively inverted orientation. Our findings are well supplemented with the molecular interaction of ATP with NLRC4, and consistent with mutagenesis data reported for human, which indicates an evolutionarily shared NOD signaling mechanism. Together, this study provides novel insights into the ATP binding mechanism, and highlights the differential ATP binding modes in zebrafish NOD1 and NOD2.
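Contacts like the H-bonds named above are typically flagged by a simple geometric criterion. The sketch below uses a heavy-atom donor-acceptor distance cutoff of ~3.5 Å; the coordinates and atom names are invented, and real analyses would also check donor-H...acceptor angles.

```python
# Minimal distance-based screen for candidate hydrogen bonds (coordinates invented).
import math

def candidate_h_bonds(donors, acceptors, cutoff=3.5):
    """Return (donor, acceptor) name pairs whose heavy-atom distance <= cutoff (angstroms)."""
    pairs = []
    for dname, dxyz in donors.items():
        for aname, axyz in acceptors.items():
            if math.dist(dxyz, axyz) <= cutoff:
                pairs.append((dname, aname))
    return pairs

# Toy geometry: a Walker A lysine close to an ATP gamma-phosphate oxygen,
# and an arginine too far away to qualify.
donors = {"LYS:NZ": (0.0, 0.0, 0.0), "ARG:NH1": (8.0, 0.0, 0.0)}
acceptors = {"ATP:O1G": (2.9, 0.5, 0.0)}
bonds = candidate_h_bonds(donors, acceptors)
```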

  20. Application of multi-thread computing and domain decomposition to the 3-D neutronics Fem code Cronos

    International Nuclear Information System (INIS)

    Ragusa, J.C.

    2003-01-01

    The purpose of this paper is to present the parallelization of the flux solver and the isotopic depletion module of the code, either using Message Passing Interface (MPI) or OpenMP. Thread parallelism using OpenMP was used to parallelize the mixed dual FEM (finite element method) flux solver MINOS. Investigations regarding the opportunity of mixing parallelism paradigms will be discussed. The isotopic depletion module was parallelized using domain decomposition and MPI. An attempt at using OpenMP was unsuccessful and will be explained. This paper is organized as follows: the first section recalls the different types of parallelism. The mixed dual flux solver and its parallelization are then presented. In the third section, we describe the isotopic depletion solver and its parallelization; and finally conclude with some future perspectives. Parallel applications are mandatory for fine mesh 3-dimensional transport and simplified transport multigroup calculations. The MINOS solver of the FEM neutronics code CRONOS2 was parallelized using the directive based standard OpenMP. An efficiency of 80% (resp. 60%) was achieved with 2 (resp. 4) threads. Parallelization of the isotopic depletion solver was obtained using domain decomposition principles and MPI. Efficiencies greater than 90% were reached. These parallel implementations were tested on a shared memory symmetric multiprocessor (SMP) cluster machine. The OpenMP implementation in the solver MINOS is only the first step towards fully using the SMPs cluster potential with a mixed mode parallelism. Mixed mode parallelism can be achieved by combining message passing interface between clusters with OpenMP implicit parallelism within a cluster
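The efficiency figures quoted above follow from the usual definitions: speedup S(p) = T(1)/T(p) and parallel efficiency E(p) = S(p)/p. The timings below are back-calculated for illustration, not measured values from the paper.

```python
# Speedup and efficiency from wall-clock timings.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_threads):
    return speedup(t_serial, t_parallel) / n_threads

t1 = 100.0        # arbitrary serial time
t2 = t1 / 1.6     # 2 threads at 80% efficiency -> speedup 1.6
t4 = t1 / 2.4     # 4 threads at 60% efficiency -> speedup 2.4
```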

  1. Application of multi-thread computing and domain decomposition to the 3-D neutronics Fem code Cronos

    Energy Technology Data Exchange (ETDEWEB)

    Ragusa, J.C. [CEA Saclay, Direction de l' Energie Nucleaire, Service d' Etudes des Reacteurs et de Modelisations Avancees (DEN/SERMA), 91 - Gif sur Yvette (France)

    2003-07-01

    The purpose of this paper is to present the parallelization of the flux solver and the isotopic depletion module of the code, either using Message Passing Interface (MPI) or OpenMP. Thread parallelism using OpenMP was used to parallelize the mixed dual FEM (finite element method) flux solver MINOS. Investigations regarding the opportunity of mixing parallelism paradigms will be discussed. The isotopic depletion module was parallelized using domain decomposition and MPI. An attempt at using OpenMP was unsuccessful and will be explained. This paper is organized as follows: the first section recalls the different types of parallelism. The mixed dual flux solver and its parallelization are then presented. In the third section, we describe the isotopic depletion solver and its parallelization; and finally conclude with some future perspectives. Parallel applications are mandatory for fine mesh 3-dimensional transport and simplified transport multigroup calculations. The MINOS solver of the FEM neutronics code CRONOS2 was parallelized using the directive based standard OpenMP. An efficiency of 80% (resp. 60%) was achieved with 2 (resp. 4) threads. Parallelization of the isotopic depletion solver was obtained using domain decomposition principles and MPI. Efficiencies greater than 90% were reached. These parallel implementations were tested on a shared memory symmetric multiprocessor (SMP) cluster machine. The OpenMP implementation in the solver MINOS is only the first step towards fully using the SMPs cluster potential with a mixed mode parallelism. Mixed mode parallelism can be achieved by combining message passing interface between clusters with OpenMP implicit parallelism within a cluster.

  2. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    Science.gov (United States)

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
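The costing arithmetic the article describes can be sketched as a sum of direct staff time (hours times hourly rate per staff level) plus apportioned space, utility, and public-relations costs. All figures below are invented placeholders, not Casey's actual numbers.

```python
# Hypothetical tax-form distribution cost calculation (all figures invented).

def staff_cost(hours_by_level, rate_by_level):
    """Direct labor: hours x hourly rate, summed over staff levels."""
    return sum(hours_by_level[lvl] * rate_by_level[lvl] for lvl in hours_by_level)

def distribution_cost(hours_by_level, rate_by_level, space_cost, utility_cost, pr_cost):
    """Direct labor plus apportioned indirect costs."""
    return staff_cost(hours_by_level, rate_by_level) + space_cost + utility_cost + pr_cost

hours = {"librarian": 40, "clerk": 120}
rates = {"librarian": 28.0, "clerk": 15.0}
total = distribution_cost(hours, rates, space_cost=450.0, utility_cost=120.0, pr_cost=75.0)
```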

  3. An assessment of mercury in estuarine sediment and tissue in Southern New Jersey using public domain data

    Science.gov (United States)

    Ng, Kara; Szabo, Zoltan; Reilly, Pamela A.; Barringer, Julia; Smalling, Kelly L.

    2016-01-01

    Mercury (Hg) is considered a contaminant of global concern for coastal environments due to its toxicity, widespread occurrence in sediment, and bioaccumulation in tissue. Coastal New Jersey, USA, is characterized by shallow bays and wetlands that provide critical habitat for wildlife but share space with expanding urban landscapes. This study was designed as an assessment of the magnitude and distribution of Hg in coastal New Jersey sediments and critical species using publicly available data to highlight potential data gaps. Mercury concentrations in estuary sediments can exceed 2 μg/g and correlate with concentrations of other metals. Based on existing data, the concentrations of Hg in mussels in southern New Jersey are comparable to those observed in other urbanized Atlantic Coast estuaries. The lack of methylmercury data for sediments, other media, and tissue is a data gap that needs to be filled for a clearer understanding of the impacts of Hg inputs to the ecosystem.

  4. Publicity.

    Science.gov (United States)

    Chisholm, Joan

    Publicity for preschool cooperatives is described. Publicity helps produce financial support for preschool cooperatives. It may take the form of posters, brochures, newsletters, open house, newspaper coverage, and radio and television. Word of mouth and general good will in the community are the best avenues of publicity that a cooperative nursery…

  5. On the interpretability and computational reliability of frequency-domain Granger causality [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-09-01

    Full Text Available This Correspondence article is a comment which directly relates to the paper “A study of problems encountered in Granger causality analysis from a neuroscience perspective” (Stokes and Purdon, 2017). We agree that interpretation issues of Granger causality (GC) in neuroscience exist, partially due to the historically unfortunate use of the name “causality”, as described in previous literature. On the other hand, we think that Stokes and Purdon use a formulation of GC which is outdated (albeit still used) and do not fully account for the potential of the different frequency-domain versions of GC; in doing so, their paper dismisses GC measures based on a suboptimal use of them. Furthermore, since data from simulated systems are used, the pitfalls that are found with the used formulation are intended to be general, and not limited to neuroscience. It would be a pity if this paper, even if written in good faith, became a wildcard against all possible applications of GC, regardless of the large body of work recently published which aims to address faults in methodology and interpretation. In order to provide a balanced view, we replicate the simulations of Stokes and Purdon, using an updated GC implementation and exploiting the combination of spectral and causal information, showing that in this way the pitfalls are mitigated or directly solved.
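The basic time-domain GC idea under discussion can be shown in a few lines (this is the classical variance-ratio formulation, not the updated spectral estimators the correspondence advocates): y "Granger-causes" x if adding y's past to an autoregression of x shrinks the residual variance. The toy system below, with one-way coupling y → x, is invented for illustration.

```python
# Minimal time-domain Granger causality: log ratio of restricted vs
# unrestricted AR(1) residual variances, on synthetic one-way-coupled data.
import math
import random

def ols2(x1, x2, y):
    """Least squares for y = b1*x1 + b2*x2 via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    c1 = sum(a * b for a, b in zip(x1, y))
    c2 = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * c1 - s12 * c2) / det, (s11 * c2 - s12 * c1) / det

def granger(x, y):
    """GC of y -> x: log(var_restricted / var_unrestricted)."""
    xt, xlag, ylag = x[1:], x[:-1], y[:-1]
    # restricted model: x_t = a * x_{t-1}
    a = sum(p * q for p, q in zip(xlag, xt)) / sum(p * p for p in xlag)
    var_r = sum((t - a * l) ** 2 for t, l in zip(xt, xlag)) / len(xt)
    # unrestricted model: x_t = b1 * x_{t-1} + b2 * y_{t-1}
    b1, b2 = ols2(xlag, ylag, xt)
    var_u = sum((t - b1 * l - b2 * m) ** 2
                for t, l, m in zip(xt, xlag, ylag)) / len(xt)
    return math.log(var_r / var_u)

random.seed(1)
x, y = [0.0], [0.0]
for _ in range(2000):
    y.append(0.5 * y[-1] + random.gauss(0, 1))          # y is autonomous
    x.append(0.5 * x[-1] + 0.8 * y[-2] + random.gauss(0, 1))  # y's past drives x

gc_y_to_x = granger(x, y)   # clearly positive
gc_x_to_y = granger(y, x)   # near zero
```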

  6. The national public's values and interests related to the Arctic National Wildlife Refuge: A computer content analysis

    Science.gov (United States)

    David N. Bengston; David P. Fan; Roger Kaye

    2010-01-01

    This study examined the national public's values and interests related to the Arctic National Wildlife Refuge. Computer content analysis was used to analyze more than 23,000 media stories about the refuge from 1995 through 2007. Ten main categories of Arctic National Wildlife Refuge values and interests emerged from the analysis, reflecting a diversity of values,...
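Computer content analysis of this kind can be caricatured as keyword scoring against value categories. The categories and keyword lists below are illustrative stand-ins, not the study's actual coding scheme.

```python
# Hypothetical keyword-based content scoring over a corpus of media stories.

CATEGORIES = {
    "wilderness": ["wilderness", "pristine", "untouched"],
    "energy": ["oil", "drilling", "petroleum"],
    "wildlife": ["caribou", "polar bear", "migratory"],
}

def score_story(text, categories=CATEGORIES):
    """Count keyword hits per category within one story."""
    lowered = text.lower()
    return {cat: sum(lowered.count(kw) for kw in kws)
            for cat, kws in categories.items()}

def tally(stories):
    """Aggregate keyword hits over the whole corpus."""
    totals = {cat: 0 for cat in CATEGORIES}
    for story in stories:
        for cat, n in score_story(story).items():
            totals[cat] += n
    return totals

stories = [
    "Drilling for oil in the refuge threatens caribou calving grounds.",
    "Many see the refuge as pristine wilderness worth protecting.",
]
totals = tally(stories)
```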

  7. Domain decomposition method for solving elliptic problems in unbounded domains

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Zhidkov, E.P.

    1991-01-01

    Computational aspects of the box domain decomposition (DD) method for solving boundary value problems in an unbounded domain are discussed. A new variant of the DD-method for elliptic problems in unbounded domains is suggested. It is based on the partitioning of an unbounded domain adapted to the given asymptotic decay of an unknown function at infinity. A comparison of computational expenditures is given for the boundary integral method and the suggested DD-algorithm. 29 refs.; 2 figs.; 2 tabs
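The overlapping-subdomain idea can be demonstrated on a toy 1D problem: solve -u'' = f on [0, 1] with u(0) = u(1) = 0 by alternately solving on two overlapping subdomains, each taking its interior boundary value from the other's latest iterate (alternating Schwarz). This illustrates the DD mechanism only; the paper's method additionally adapts the partitioning to the solution's decay at infinity.

```python
# Alternating Schwarz for -u'' = 2 on [0,1], exact solution u(x) = x(1-x).

def solve_tridiag(n, h, f_vals, left, right):
    """Direct solve of -u'' = f on n+1 points with Dirichlet ends (Thomas algorithm)."""
    # Interior unknowns u_1..u_{n-1}: (-u_{i-1} + 2u_i - u_{i+1}) = h^2 f_i
    a, b, c = -1.0, 2.0, -1.0
    d = [h * h * f_vals[i] for i in range(1, n)]
    d[0] -= a * left          # fold known boundary values into the RHS
    d[-1] -= c * right
    cp = [0.0] * (n - 1)      # forward elimination
    dp = [0.0] * (n - 1)
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n - 1):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    u = [0.0] * (n - 1)       # back substitution
    u[-1] = dp[-1]
    for i in range(n - 3, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return [left] + u + [right]

N = 50                        # global grid x_i = i/N
h = 1.0 / N
f = [2.0] * (N + 1)
u = [0.0] * (N + 1)           # global iterate
iA, iB = 30, 20               # subdomain A = [0, 0.6], B = [0.4, 1] (overlap [0.4, 0.6])

for _ in range(40):
    # solve on A using the current global value at its right boundary
    u[: iA + 1] = solve_tridiag(iA, h, f[: iA + 1], 0.0, u[iA])
    # solve on B using A's fresh value at its left boundary
    u[iB:] = solve_tridiag(N - iB, h, f[iB:], u[iB], 0.0)

err = max(abs(u[i] - (i / N) * (1 - i / N)) for i in range(N + 1))
```

Because the exact solution is quadratic, the central-difference discretization is nodally exact, so the Schwarz iterates converge to the analytic solution at the grid points.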

  8. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Science.gov (United States)

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  9. 77 FR 26509 - Notice of Public Meeting-Cloud Computing Forum & Workshop V

    Science.gov (United States)

    2012-05-04

    ...--Cloud Computing Forum & Workshop V AGENCY: National Institute of Standards & Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop V to be held on Tuesday... workshop. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology...

  10. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Science.gov (United States)

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  11. Quantitative Literacy in the Affective Domain: Computational Geology Students’ Reactions to Devlin’s The Math Instinct

    Directory of Open Access Journals (Sweden)

    Victor J. Ricchezza

    2017-07-01

    Full Text Available Building on suggestions from alumni of a recent interview project, students in Computational Geology at the University of South Florida were tasked with reading a popular non-fiction book on mathematics and writing about the book and their feelings about math. The book, The Math Instinct by Keith Devlin, was chosen because we believed it would give the students something interesting to write about, not because we had any particular expectations about what it might reveal about, or do for, their math anxiety. The nature of the responses received from the students led to a post hoc study of the emotional affect of math in the students' lives and how it changed as they proceeded through the book and reflected back on it at the end. Of the 28 students in the fall 2016 section of the course, 25 had an improved or slightly improved attitude toward math by the end of the semester. The assignment was more successful than we could have anticipated at generating thought and getting students to communicate about math, an integral component of quantitative literacy. Although the limited size and post hoc nature of the study make it difficult to generalize, the results are promising and invite further use of the assignment in the course.

  12. Domain analysis

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.

  13. Public Computer Assisted Learning Facilities for Children with Visual Impairment: Universal Design for Inclusive Learning

    Science.gov (United States)

    Siu, Kin Wai Michael; Lam, Mei Seung

    2012-01-01

    Although computer assisted learning (CAL) is becoming increasingly popular, people with visual impairment face greater difficulty in accessing computer-assisted learning facilities. This is primarily because most of the current CAL facilities are not visually impaired friendly. People with visual impairment also do not normally have access to…

  14. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process

    Directory of Open Access Journals (Sweden)

    El Sawaf Gamal

    2009-07-01

    Full Text Available Abstract Background The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Results Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by the sequence alignment method applied to the FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with respect to a control HCV sequence from the Core protein, giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows a direct comparison of domains for the presence of common hydrophobicity patterns, on which the physical interaction is based. RQA greatly strengthened the reliability of the hypothesis by scoring many cross-recurrences between the FP and CT peptide hydrophobicity patterns, largely outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT

  15. Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to Computational Chemistry. We have created the first web portal (www.chemsem.com) that allows data created in the calculations of quantum chemistry, and other such chemistry calculations, to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used, an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only the start of what must be a long multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a new potential file standard (Common Standard for eXchange, CSX) for computational chemistry data. This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that are part of the graph database that the semantic web employs. We propose the CSX file as a convenient way to encapsulate computational chemistry data.
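The RDF idea behind such a portal can be miniaturized: data as subject-predicate-object triples, queried by pattern matching with variables, which is the core of what a SPARQL basic graph pattern does. The vocabulary below (`gc:hasMethod`, `calc:job1`, etc.) is an invented stand-in, not the actual Gainesville Core ontology.

```python
# Miniature triple store with SPARQL-like single-pattern matching.

def match(triples, pattern):
    """Return variable bindings for one (s, p, o) pattern; '?x' terms are variables."""
    results = []
    for triple in triples:
        binding = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                binding[term] = value      # bind the variable to this term
            elif term != value:
                break                      # constant mismatch: reject triple
        else:
            results.append(binding)
    return results

triples = [
    ("calc:job1", "gc:hasMethod", "B3LYP"),
    ("calc:job1", "gc:hasBasisSet", "6-31G*"),
    ("calc:job2", "gc:hasMethod", "MP2"),
]

# Analogue of: SELECT ?job WHERE { ?job gc:hasMethod "B3LYP" }
hits = match(triples, ("?job", "gc:hasMethod", "B3LYP"))
```

A real SPARQL engine additionally joins multiple patterns and enforces consistent bindings across them; this sketch handles a single pattern only.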

  16. Does protecting humans protect the environment? A crude examination for UK nuclear power plants and the marine environment using information in the public domain

    International Nuclear Information System (INIS)

    Brownless, G P

    2008-01-01

    Current activity around radiological protection of the environment implies concerns over the effectiveness of the current approach, namely that if humans are adequately protected, then so are non-human species. This study uses models and data currently available in the public domain to carry out a 'quick and dirty' examination of whether protecting humans does indeed imply that other species are well protected. Using marine discharges and human habits data for operational coastal UK nuclear power stations, this study compares doses to humans and a set of reference non-human species. The study concludes that the availability of data and models, and the consequent ease of studying potential effects on non-humans (as well as humans), vindicates recent efforts in this area, and that these imply a high level of protection, in general, for non-human biota from UK nuclear power station marine discharges. In general terms, the study finds that protection of non-human biota relies on taking both ingestion and external exposure doses to humans into account; where only one of these pathways is considered, the level of protection of non-human biota through protection of humans would depend on the radionuclide(s) in question.

  17. Selected Publications in Image Understanding and Computer Vision from 1974 to 1983

    Science.gov (United States)

    1985-04-18

    [Scanned front matter, partially recovered: a reference to "Reconnaissance des Formes et Intelligence Artificielle" (2ème Congrès AFCET-IRIA, Toulouse), Plenum, New York, 1979, and a list of abbreviations for cited venues, including AI (Artificial Intelligence), BC (Biological Cybernetics), CACM (Communications of the ACM), CG (Computer Graphics), PACM (Proceedings of the ACM), P-IEEE (Proceedings of the IEEE), P-NCC (Proceedings of the National Computer Conference), and PR (Pattern Recognition).]

  18. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    International Nuclear Information System (INIS)

    Kinefuchi, K.; Funaki, I.; Shimada, T.; Abe, T.

    2012-01-01

    Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and establish a prediction process for in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test for a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupling simulation of the inviscid-frozen-flow computational fluid dynamics of an exhaust plume and detailed analysis of microwave transmissions by applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream the Mach disk in the exhaust plume. It was concluded that the coupling estimation method based on the physics of the frozen plasma flow with Drude dispersion would be suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow condition.

  19. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    Energy Technology Data Exchange (ETDEWEB)

    Kinefuchi, K. [Department of Aeronautics and Astronautics, University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Funaki, I.; Shimada, T.; Abe, T. [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1, Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252-5210 (Japan)

    2012-10-15

    Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and establish a prediction process for in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test for a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupling simulation of the inviscid-frozen-flow computational fluid dynamics of an exhaust plume and detailed analysis of microwave transmissions by applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream the Mach disk in the exhaust plume. It was concluded that the coupling estimation method based on the physics of the frozen plasma flow with Drude dispersion would be suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow condition.
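The plasma attenuation physics invoked in these two records can be reduced to a back-of-the-envelope calculation: a Drude dielectric for the exhaust plasma, and the field attenuation a wave at angular frequency ω suffers passing through it. The plasma and collision frequencies below are made-up magnitudes, not values from the firing test.

```python
# Drude-model permittivity and the resulting attenuation in dB/m.
import cmath
import math

C = 299_792_458.0  # speed of light, m/s

def drude_permittivity(w, w_plasma, nu):
    """Relative permittivity eps(w) = 1 - wp^2 / (w * (w + i*nu)), e^{-iwt} convention."""
    return 1.0 - w_plasma**2 / (w * (w + 1j * nu))

def attenuation_db_per_m(w, w_plasma, nu):
    """Field decay exp(-alpha x) converted to dB/m: 8.686 * (w/c) * Im(sqrt(eps))."""
    n = cmath.sqrt(drude_permittivity(w, w_plasma, nu))
    return 8.686 * (w / C) * n.imag

w = 2 * math.pi * 2.3e9   # S-band telemetry carrier, ~2.3 GHz
loss = attenuation_db_per_m(w, w_plasma=2 * math.pi * 1.5e9, nu=1.0e9)
no_plasma = attenuation_db_per_m(w, w_plasma=0.0, nu=1.0e9)
```

With collisions (ν > 0) the permittivity acquires a positive imaginary part, so the wave is attenuated; with no plasma the attenuation vanishes.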

  20. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side.
Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level
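The compartmental mechanics underlying tools like GLEaMviz can be reduced to a deterministic single-population SIR model integrated with Euler steps. GLEaM itself is stochastic and couples thousands of subpopulations via mobility data; the parameter values below are arbitrary.

```python
# Single-population SIR model, explicit Euler integration.

def sir_run(s0, i0, r0, beta, gamma, dt=0.1, steps=2000):
    """Integrate dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I, dR/dt = gamma*I."""
    n = s0 + i0 + r0
    s, i, r = float(s0), float(i0), float(r0)
    peak_i = i
    for _ in range(steps):
        new_inf = beta * s * i / n * dt   # S -> I transitions this step
        new_rec = gamma * i * dt          # I -> R transitions this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak_i = max(peak_i, i)
    return s, i, r, peak_i

# R0 = beta/gamma = 2: the outbreak grows, peaks, and burns out.
s, i, r, peak = sir_run(s0=999_000, i0=1_000, r0=0, beta=0.5, gamma=0.25)
```

The three update terms move mass between compartments without creating or destroying any, so the total population is conserved up to floating-point rounding.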

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. Computer-Based Video Instruction to Teach Students with Intellectual Disabilities to Use Public Bus Transportation

    Science.gov (United States)

    Mechling, Linda; O'Brien, Eileen

    2010-01-01

    This study investigated the effectiveness of computer-based video instruction (CBVI) to teach three young adults with moderate intellectual disabilities to push a "request to stop bus signal" and exit a city bus in response to target landmarks. A multiple probe design across three students and one bus route was used to evaluate effectiveness of…

  3. Computers in Education: An Overview. Publication Number One. Software Engineering/Education Cooperative Project.

    Science.gov (United States)

    Collis, Betty; Muir, Walter

    The first of four major sections in this report presents an overview of the background and evolution of computer applications to learning and teaching. It begins with the early attempts toward "automated teaching" of the 1920s, and the "teaching machines" of B. F. Skinner of the 1940s through the 1960s. It then traces the…

  4. Deciphering Dimerization Modes of PAS Domains: Computational and Experimental Analyses of the AhR:ARNT Complex Reveal New Insights Into the Mechanisms of AhR Transformation.

    Science.gov (United States)

    Corrada, Dario; Soshilov, Anatoly A; Denison, Michael S; Bonati, Laura

    2016-06-01

    The Aryl hydrocarbon Receptor (AhR) is a transcription factor that mediates the biochemical response to xenobiotics and the toxic effects of a number of environmental contaminants, including dioxins. Recently, endogenous regulatory roles for the AhR in normal physiology and development have also been reported, thus extending the interest in understanding its molecular mechanisms of activation. Since dimerization with the AhR Nuclear Translocator (ARNT) protein, occurring through the Helix-Loop-Helix (HLH) and PER-ARNT-SIM (PAS) domains, is needed to convert the AhR into its transcriptionally active form, deciphering the AhR:ARNT dimerization mode would provide insights into the mechanisms of AhR transformation. Here we present homology models of the murine AhR:ARNT PAS domain dimer developed using recently available X-ray structures of other bHLH-PAS protein dimers. Due to the different reciprocal orientation and interaction surfaces in the different template dimers, two alternative models were developed for both the PAS-A and PAS-B dimers and they were characterized by combining a number of computational evaluations. Both well-established hot spot prediction methods and new approaches to analyze individual residue and residue-pairwise contributions to the MM-GBSA binding free energies were adopted to predict residues critical for dimer stabilization. On this basis, a mutagenesis strategy for both the murine AhR and ARNT proteins was designed and ligand-dependent DNA binding ability of the AhR:ARNT heterodimer mutants was evaluated. While functional analysis disfavored the HIF2α:ARNT heterodimer-based PAS-B model, most mutants derived from the CLOCK:BMAL1-based AhR:ARNT dimer models of both the PAS-A and the PAS-B dramatically decreased the levels of DNA binding, suggesting this latter model as the most suitable for describing AhR:ARNT dimerization. These novel results open new research directions focused at elucidating basic molecular mechanisms underlying the

  5. Deciphering Dimerization Modes of PAS Domains: Computational and Experimental Analyses of the AhR:ARNT Complex Reveal New Insights Into the Mechanisms of AhR Transformation.

    Directory of Open Access Journals (Sweden)

    Dario Corrada

    2016-06-01

    Full Text Available The Aryl hydrocarbon Receptor (AhR) is a transcription factor that mediates the biochemical response to xenobiotics and the toxic effects of a number of environmental contaminants, including dioxins. Recently, endogenous regulatory roles for the AhR in normal physiology and development have also been reported, thus extending the interest in understanding its molecular mechanisms of activation. Since dimerization with the AhR Nuclear Translocator (ARNT) protein, occurring through the Helix-Loop-Helix (HLH) and PER-ARNT-SIM (PAS) domains, is needed to convert the AhR into its transcriptionally active form, deciphering the AhR:ARNT dimerization mode would provide insights into the mechanisms of AhR transformation. Here we present homology models of the murine AhR:ARNT PAS domain dimer developed using recently available X-ray structures of other bHLH-PAS protein dimers. Due to the different reciprocal orientation and interaction surfaces in the different template dimers, two alternative models were developed for both the PAS-A and PAS-B dimers and they were characterized by combining a number of computational evaluations. Both well-established hot spot prediction methods and new approaches to analyze individual residue and residue-pairwise contributions to the MM-GBSA binding free energies were adopted to predict residues critical for dimer stabilization. On this basis, a mutagenesis strategy for both the murine AhR and ARNT proteins was designed and ligand-dependent DNA binding ability of the AhR:ARNT heterodimer mutants was evaluated. While functional analysis disfavored the HIF2α:ARNT heterodimer-based PAS-B model, most mutants derived from the CLOCK:BMAL1-based AhR:ARNT dimer models of both the PAS-A and the PAS-B dramatically decreased the levels of DNA binding, suggesting this latter model as the most suitable for describing AhR:ARNT dimerization. These novel results open new research directions focused on elucidating basic molecular mechanisms

  6. Factors Influencing the Adoption of and Business Case for Cloud Computing in the Public Sector

    NARCIS (Netherlands)

    Kuiper, E.; Van Dam, F.; Reiter, A.; Janssen, M.F.W.H.A.

    2014-01-01

    Cloud adoption in the public sector is taking off slowly, which is perceived as a problem. Models of factors influencing cloud adoption are derived for better understanding using literature and results obtained via desk research and surveys by the Cloud for Europe project. We conclude that several

  7. Energy expenditure of three public and three home based active computer games in children.

    NARCIS (Netherlands)

    Simons, M.; de Vries, S.I.; Jongert, M.W.A.

    2014-01-01

    The purpose of this study was to assess the energy expenditure (EE) experienced by children when playing six active video games, which can be used in a home environment and in a public setting (e.g. game center), and to evaluate whether the intensity of playing these games can meet the threshold for

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  11. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, that supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, a live collaboration notebook via Google Docs, and better support for languages other than Python.
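The notebook document format described above is plain JSON, so a minimal notebook can be assembled with the standard library alone. This sketch builds a two-cell notebook by hand; real tooling would use the nbformat and nbconvert packages, and the cell contents and filename here are invented for illustration.

```python
import json

# Minimal notebook in the nbformat-4 JSON layout: top-level metadata
# plus a list of cells, each with a type and its source text.
nb = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {"kernelspec": {"name": "python3", "display_name": "Python 3"}},
    "cells": [
        {"cell_type": "markdown", "metadata": {}, "source": ["# Results\n"]},
        {"cell_type": "code", "execution_count": None, "metadata": {},
         "outputs": [], "source": ["print(6 * 7)\n"]},
    ],
}

with open("demo.ipynb", "w") as f:
    json.dump(nb, f)

# Conversion to a static format would then be, e.g.:
#   jupyter nbconvert --to html demo.ipynb
```

Because the on-disk format is ordinary JSON, notebooks are easy to generate, diff, and post-process programmatically, which is what enables the HTML/PDF publication workflows the abstract mentions.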

  12. Factors Influencing the Adoption of and Business Case for Cloud Computing in the Public Sector

    OpenAIRE

    Kuiper, E.; Van Dam, F.; Reiter, A.; Janssen, M.F.W.H.A.

    2014-01-01

    Cloud adoption in the public sector is taking off slowly, which is perceived as a problem. Models of factors influencing cloud adoption are derived for better understanding using literature and results obtained via desk research and surveys by the Cloud for Europe project. We conclude that several factors require further research, such as the culture in countries, climate, legislation, economics and politics, IT staff shortage and feelings of uncertainty, fear and impatience. Adoption factors...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  14. Reference computations of public dose and cancer risk from airborne releases of plutonium. Nuclear safety technical report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, V.L.

    1993-12-23

    This report presents results of computations of doses and the associated health risks of postulated accidental atmospheric releases from the Rocky Flats Plant (RFP) of one gram of weapons-grade plutonium in a form that is respirable. These computations are intended to be reference computations that can be used to evaluate a variety of accident scenarios by scaling the dose and health risk results presented here according to the amount of plutonium postulated to be released, instead of repeating the computations for each scenario. The MACCS2 code has been used as the basis of these computations. The basis and capabilities of MACCS2 are summarized, the parameters used in the evaluations are discussed, and results are presented for the doses and health risks to the public, both the Maximum Offsite Individual (a maximally exposed individual at or beyond the plant boundaries) and the population within 50 miles of RFP. A number of different weather scenarios are evaluated, including constant weather conditions and observed weather for 1990, 1991, and 1992. The isotopic mix of weapons-grade plutonium will change as it ages, the ²⁴¹Pu decaying into ²⁴¹Am. The ²⁴¹Am reaches a peak concentration after about 72 years. The doses to the bone surface, liver, and whole body will increase slightly but the dose to the lungs will decrease slightly. The overall cancer risk will show almost no change over this period. This change in cancer risk is much smaller than the year-to-year variations in cancer risk due to weather. Finally, χ/Q values are also presented for other applications, such as for hazardous chemical releases. These include the χ/Q values for the MOI, for a collocated worker at 100 meters downwind of an accident site, and the χ/Q value integrated over the population out to 50 miles.
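The scaling approach the report describes is linear in the source term: doses for an arbitrary release are obtained by multiplying the one-gram reference result by the released amount. A minimal sketch, where the reference dose value is purely hypothetical:

```python
def scaled_dose(reference_dose_msv, released_g, reference_release_g=1.0):
    """Scale a reference dose linearly with the released amount.

    The reference computations assume a 1 g respirable release;
    doses for other source terms scale proportionally.
    """
    return reference_dose_msv * released_g / reference_release_g

# Hypothetical reference value: 20 mSv per 1 g released.
# A postulated 5 g release then gives 100 mSv.
dose = scaled_dose(20, 5.0)
print(dose)  # 100.0
```

The same proportionality applies to the associated cancer-risk figures, which is what makes a single set of reference computations reusable across accident scenarios.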

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  17. ACUTRI a computer code for assessing doses to the general public due to acute tritium releases

    CERN Document Server

    Yokoyama, S; Noguchi, H; Ryufuku, S; Sasaki, T

    2002-01-01

    Tritium, which is used as a fuel of a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in a discussion of licensing of a fusion experimental reactor and an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and ICRP dose models. In this calculation it is also possible to analyze statistically on meteorology in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: i...

  18. Virtual Space Exploration: Let's Use Web-Based Computer Game Technology to Boost IYA 2009 Public Interest

    Science.gov (United States)

    Hussey, K.; Doronila, P.; Kulikov, A.; Lane, K.; Upchurch, P.; Howard, J.; Harvey, S.; Woodmansee, L.

    2008-09-01

    With the recent releases of both Google's "Sky" and Microsoft's "WorldWide Telescope" and the large and increasing popularity of video games, the time is now for using these tools, and those crafted at NASA's Jet Propulsion Laboratory, to engage the public in astronomy like never before. This presentation will use "Cassini at Saturn Interactive Explorer " (CASSIE) to demonstrate the power of web-based video-game engine technology in providing the public a "first-person" look at space exploration. The concept of virtual space exploration is to allow the public to "see" objects in space as if they were either riding aboard or "flying" next to an ESA/NASA spacecraft. Using this technology, people are able to immediately "look" in any direction from their virtual location in space and "zoom-in" at will. Users can position themselves near Saturn's moons and observe the Cassini Spacecraft's "encounters" as they happened. Whenever real data for their "view" exists it is incorporated into the scene. Where data is missing, a high-fidelity simulation of the view is generated to fill in the scene. The observer can also change the time of observation into the past or future. Our approach is to utilize and extend the Unity 3d game development tool, currently in use by the computer gaming industry, along with JPL mission specific telemetry and instrument data to build our virtual explorer. The potential of the application of game technology for the development of educational curricula and public engagement are huge. We believe this technology can revolutionize the way the general public and the planetary science community views ESA/NASA missions and provides an educational context that is attractive to the younger generation. This technology is currently under development and application at JPL to assist our missions in viewing their data, communicating with the public and visualizing future mission plans. 
Real-time demonstrations of CASSIE and other applications in development

  19. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  6. La apropiación del dominio público y las posibilidades de acceso a los bienes culturales | The appropriation of the public domain and the possibilities of access to cultural goods

    Directory of Open Access Journals (Sweden)

    Joan Ramos Toledano

    2017-06-01

    Full Text Available The legislation of continental intellectual property and copyright provides for a period of protection granting exclusive and temporary economic rights. After a certain period, protected works enter into what is called the public domain. This is often considered the moment at which cultural goods come under the control and domain of society as a whole. The present paper argues that, given our current economic system, the public domain actually functions more as a business opportunity for certain companies than as a real option for the public to access artistic and intellectual works.

  7. Time-domain numerical computations of electromagnetic fields in cylindrical co-ordinates using the transmission line matrix: evaluation of radiation losses from a charge bunch passing through a pill-box resonator

    International Nuclear Information System (INIS)

    Sarma, J.; Robson, P.N.

    1979-01-01

    The two dimensional transmission line matrix (TLM) numerical method has been adapted to compute electromagnetic field distributions in cylindrical co-ordinates and it is applied to evaluate the radiation loss from a charge bunch passing through a 'pill-box' resonator. The computer program has been developed to calculate not only the total energy loss to the resonator but also that component of it which exists in the TM010 mode. The numerically computed results are shown to agree very well with the analytically derived values as found in the literature, which therefore establishes the degree of accuracy that is obtained with the TLM method. The particular features of computational simplicity, numerical stability and the inherently time-domain solutions produced by the TLM method are cited as additional, attractive reasons for using this numerical procedure in solving such problems. (Auth.)
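The TLM idea can be sketched for a plain 2D Cartesian shunt-node mesh (the paper's cylindrical-coordinate adaptation is more involved): pulses incident on the four ports of each node are scattered by the orthogonal matrix S = ½J − I (J the all-ones matrix) and then passed to neighbouring nodes. Grid size, step count, injection point, and the open-circuit boundary treatment below are invented for the example.

```python
import math

NX, NY, STEPS = 8, 8, 50

# Incident pulse amplitudes per node; ports 0..3 face -x, +x, -y, +y.
V = [[[0.0] * 4 for _ in range(NY)] for _ in range(NX)]
V[4][4][0] = 1.0  # inject a unit pulse

def energy(V):
    """Sum of squared pulse amplitudes over the whole mesh."""
    return sum(v * v for col in V for node in col for v in node)

E0 = energy(V)
for _ in range(STEPS):
    # Scatter at each node: reflected_k = 0.5 * sum(incident) - incident_k,
    # i.e. S = 0.5*J - I, which is orthogonal and hence energy-conserving.
    for x in range(NX):
        for y in range(NY):
            s = sum(V[x][y]) * 0.5
            V[x][y] = [s - v for v in V[x][y]]
    # Connect: each scattered pulse becomes an incident pulse on the
    # facing port of the neighbour; mesh edges reflect with +1 (open).
    Vn = [[[0.0] * 4 for _ in range(NY)] for _ in range(NX)]
    for x in range(NX):
        for y in range(NY):
            out = V[x][y]
            if x > 0:      Vn[x - 1][y][1] = out[0]
            else:          Vn[x][y][0] = out[0]
            if x < NX - 1: Vn[x + 1][y][0] = out[1]
            else:          Vn[x][y][1] = out[1]
            if y > 0:      Vn[x][y - 1][3] = out[2]
            else:          Vn[x][y][2] = out[2]
            if y < NY - 1: Vn[x][y + 1][2] = out[3]
            else:          Vn[x][y][3] = out[3]
    V = Vn

# The lossless network conserves total pulse energy at every step.
assert math.isclose(energy(V), E0)
```

Because every quantity lives on a discrete time step, field evolution and radiated energy can be read off directly, which is the "inherently time-domain" property the abstract highlights.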

  8. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    Science.gov (United States)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advances in Information and Communication Technologies (ICT) have increased the demand for cloud services that share users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter-registration, social-network, and customer-service data. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy-preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, I propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms K-anonymity, L-diversity, and (α, k)-anonymity.
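The K-anonymity property the abstract builds on has a concrete statement: after generalization, every combination of quasi-identifier values must occur at least k times, so no record can be singled out by those attributes. A minimal check, with an invented toy dataset whose ages are generalized into decades and ZIP codes truncated:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

# Toy generalized dataset (hypothetical values for illustration).
rows = [
    {"age": "20-29", "zip": "130**", "diagnosis": "flu"},
    {"age": "20-29", "zip": "130**", "diagnosis": "cold"},
    {"age": "30-39", "zip": "148**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "148**", "diagnosis": "asthma"},
]
assert is_k_anonymous(rows, ["age", "zip"], k=2)
assert not is_k_anonymous(rows, ["age", "zip"], k=3)
```

Extensions such as L-diversity and (α, k)-anonymity add conditions on the sensitive attribute (here "diagnosis") within each group, addressing the limitations of plain K-anonymity that the abstract refers to.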

  9. ACUTRI: a computer code for assessing doses to the general public due to acute tritium releases

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Sumi; Noguchi, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ryufuku, Susumu; Sasaki, Toshihisa; Kurosawa, Naohiro [Visible Information Center, Inc., Tokai, Ibaraki (Japan)

    2002-11-01

    Tritium, which is used as fuel in a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in licensing discussions for a fusion experimental reactor and in an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and on ICRP dose models. In this calculation it is also possible to perform a statistical analysis of meteorology in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: inhalation from a primary plume (HT and/or HTO) released from the facilities and inhalation from a secondary plume (HTO) re-emitted from the ground following deposition of HT and HTO. This report describes an outline of the ACUTRI code, a user guide and the results of a test calculation. (author)
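
    The Gaussian plume dispersion step mentioned above can be sketched with the standard ground-reflection form of the plume equation; the function and all numeric values below are illustrative and are not taken from the ACUTRI code:

```python
import math

def plume_concentration(Q, u, sigma_y, sigma_z, y, z, h):
    """Gaussian plume concentration with ground reflection:
    chi = Q / (2*pi*u*sy*sz) * exp(-y^2 / 2sy^2)
          * [exp(-(z-h)^2 / 2sz^2) + exp(-(z+h)^2 / 2sz^2)]
    for a source of strength Q, wind speed u, release height h."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-((z - h)**2) / (2.0 * sigma_z**2))
                + math.exp(-((z + h)**2) / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centreline, ground-level concentration for a ground-level release
# (hypothetical numbers): 1e9 Bq/s source, 2 m/s wind, sigma_y = 30 m,
# sigma_z = 15 m at the receptor distance.
chi = plume_concentration(Q=1.0e9, u=2.0, sigma_y=30.0, sigma_z=15.0,
                          y=0.0, z=0.0, h=0.0)
```

    For a ground-level release evaluated on the centreline at ground level, the reflection term doubles the concentration, so chi reduces to Q / (pi * u * sigma_y * sigma_z).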

  10. The Perceptions of Globalization at a Public Research University Computer Science Graduate Department

    Science.gov (United States)

    Nielsen, Selin Yildiz

    Based on a qualitative methodological approach, this study focuses on the understanding of a phenomenon called globalization in a research university computer science department. The study looks into the participants' perspectives on the department, its dynamics, culture and academic environment as related to globalization. The economic, political, academic and social/cultural aspects of the department are taken into consideration in investigating the influences of globalization. Three questions guide this inquiry: 1) How is the notion of globalization interpreted in this department? 2) How does the perception of globalization influence the department in terms of finances, academics, policies and social life? 3) How do these perceptions influence the selection of students? Globalization and the neo-institutional view of legitimacy are used as theoretical lenses to conceptualize responses to these questions. The data include interviews, field notes, and official and non-official documents. Interpretations of these data are compared to findings from prior research on the impact of globalization in order to clarify and validate findings. Findings show that there is disagreement in how the notion of globalization is interpreted between the doctoral students and the faculty in the department. This disagreement revealed the attitudes and interpretations of globalization in the light of the policies and procedures related to the department. How the faculty experience globalization is not consistent with the literature in this project. The literature states that globalization is a big part of higher education and a phenomenon that causes changes in the goals and missions of higher education institutions (Knight, 2003; De Witt, 2005). The data revealed that globalization is not the cause of change but more of a consequence of actions that take place in achieving the goals and missions of the department.

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure, as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation and its components are now deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. Reference computations of public dose and cancer risk from airborne releases of uranium and Class W plutonium

    International Nuclear Information System (INIS)

    Peterson, V.L.

    1995-01-01

    This report presents ''reference'' computations that can be used by safety analysts in evaluating the consequences of postulated atmospheric releases of radionuclides from the Rocky Flats Environmental Technology Site. These computations deal specifically with doses and health risks to the public. The radionuclides considered are Class W Plutonium, all classes of Enriched Uranium, and all classes of Depleted Uranium. (The other class of plutonium, Y, was treated in an earlier report.) In each case, one gram of the respirable material is assumed to be released at ground level, both with and without fire. The resulting doses and health risks can be scaled to whatever amount of release is appropriate for a postulated accident being investigated. The report begins with a summary of the organ-specific stochastic risk factors appropriate for alpha radiation, which poses the main health risk of plutonium and uranium. This is followed by a summary of the atmospheric dispersion factors for unfavorable and typical weather conditions for the calculation of consequences to both the Maximum Offsite Individual and the general population within 80 km (50 miles) of the site
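
    Because the doses are computed for a one-gram reference release and scale linearly with the source term, rescaling to a postulated accident is a single multiplication; the reference dose value below is hypothetical, not a number from the report:

```python
def scaled_dose(reference_dose_per_gram, released_grams):
    """Reference doses are computed for a 1 g respirable release, so the
    consequence for any postulated release scales linearly with mass."""
    return reference_dose_per_gram * released_grams

# Hypothetical reference value (NOT from the report): 0.05 Sv per gram
# released, to the Maximum Offsite Individual under unfavorable weather.
dose = scaled_dose(0.05, 12.0)  # postulated 12 g release
print(dose)
```
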

  20. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report performance of two commercial and one academic CAD system. The influence of presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)
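
    The quoted operating point (82 % sensitivity at 3.1 false positives per scan) is one point on a FROC curve. A generic sketch of how such a point can be read off from scored CAD marks; the data are a toy example, the function is not the study's evaluation code, and it assumes each true-positive mark hits a distinct lesion:

```python
def froc_sensitivity_at(marks, n_lesions, n_scans, max_fp_per_scan):
    """Sensitivity achievable while averaging at most max_fp_per_scan
    false positives per scan: sweep a score threshold from high to low
    and keep the last point still within the false-positive budget."""
    tp = fp = 0
    best = 0.0
    for score, is_true_positive in sorted(marks, key=lambda m: -m[0]):
        if is_true_positive:
            tp += 1
        else:
            fp += 1
        if fp / n_scans <= max_fp_per_scan:
            best = tp / n_lesions
    return best

# Toy example: 5 scored CAD marks over 2 scans containing 4 true lesions.
marks = [(0.9, True), (0.8, False), (0.7, True), (0.6, False), (0.5, True)]
sens = froc_sensitivity_at(marks, n_lesions=4, n_scans=2, max_fp_per_scan=1.0)
print(sens)  # 0.75
```
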

  1. Expanding the landscape of chromatin modification (CM)-related functional domains and genes in human.

    Directory of Open Access Journals (Sweden)

    Shuye Pu

    2010-11-01

    Full Text Available Chromatin modification (CM) plays a key role in regulating transcription, DNA replication, repair and recombination. However, our knowledge of these processes in humans remains very limited. Here we use computational approaches to study proteins and functional domains involved in CM in humans. We analyze the abundance and the pair-wise domain-domain co-occurrences of 25 well-documented CM domains in 5 model organisms: yeast, worm, fly, mouse and human. Results show that domains involved in histone methylation, DNA methylation, and histone variants are remarkably expanded in metazoans, reflecting the increased demand for cell type-specific gene regulation. We find that CM domains tend to co-occur with a limited number of partner domains and are hence not promiscuous. This property is exploited to identify 47 potentially novel CM domains, including 24 DNA-binding domains, whose role in CM has received little attention so far. Lastly, we use a consensus Machine Learning approach to predict 379 novel CM genes (coding for 329 proteins) in humans based on domain compositions. Several of these predictions are supported by very recent experimental studies and others are slated for experimental verification. Identification of novel CM genes and domains in humans will aid our understanding of fundamental epigenetic processes that are important for stem cell differentiation and cancer biology. Information on all the candidate CM domains and genes reported here is publicly available.
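
    The pair-wise domain-domain co-occurrence analysis described above reduces, at its core, to counting domain pairs across protein annotations. A minimal sketch; the domain annotations below are invented for illustration:

```python
from collections import Counter
from itertools import combinations

def domain_cooccurrence(proteins):
    """Count pair-wise co-occurrences of functional domains across
    proteins, each protein given as a set of domain names."""
    pairs = Counter()
    for domains in proteins:
        for a, b in combinations(sorted(domains), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy annotations (invented): PHD fingers co-occurring with bromodomains,
# a pattern typical of chromatin readers.
proteins = [
    {"PHD", "Bromo"},
    {"PHD", "Bromo", "SET"},
    {"Chromo", "SET"},
]
counts = domain_cooccurrence(proteins)
print(counts[("Bromo", "PHD")])  # 2
```

    Sorting each protein's domains before pairing gives a canonical key, so ("Bromo", "PHD") and ("PHD", "Bromo") are counted as one pair.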

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  3. Artificial proteins as allosteric modulators of PDZ3 and SH3 in two-domain constructs: A computational characterization of novel chimeric proteins

    Czech Academy of Sciences Publication Activity Database

    Palani, K.; Pfeiferová, L.; Boušová, Kristýna; Bednárová, L.; Obšilová, Veronika; Vondrášek, J.

    2016-01-01

    Roč. 84, č. 10 (2016), s. 1358-1374 ISSN 0887-3585 Institutional support: RVO:67985823 Keywords : protein design * fusion proteins * PDZ3 * SH3 * Trp-cage * two domain proteins * molecular dynamics simulation * circular dichroism Subject RIV: BO - Biophysics Impact factor: 2.289, year: 2016

  4. Artificial proteins as allosteric modulators of PDZ3 and SH3 in two-domain constructs: A computational characterization of novel chimeric proteins

    Czech Academy of Sciences Publication Activity Database

    Palani, Kirubakaran; Pfeiferová, Lucie; Boušová, Kristýna; Bednárová, Lucie; Obšilová, V.; Vondrášek, Jiří

    2016-01-01

    Roč. 84, č. 10 (2016), s. 1358-1374 ISSN 0887-3585 Institutional support: RVO:61388963 Keywords : protein design * fusion proteins * PDZ3 * SH3 * Trp-cage * two domain proteins Subject RIV: CE - Biochemistry Impact factor: 2.289, year: 2016

  5. Computational dissection of allosteric inhibition of the SH2 domain of Bcr-Abl kinase by the monobody inhibitor AS25.

    Science.gov (United States)

    Ji, Mingfei; Zheng, Guodong; Li, Xiaolong; Zhang, Zhongqin; Jv, Guanqun; Wang, Xiaowei; Wang, Jialin

    2017-06-01

    The deregulated breakpoint cluster region (Bcr)-Abelson tyrosine kinase (Abl) fusion protein represents an attractive pharmacological target for the treatment of chronic myeloid leukemia (CML). The high-affinity monobody AS25 was designed to target the Src homology 2 (SH2) domain of Bcr-Abl, leading to allosteric inhibition of Bcr-Abl through the formation of protein-protein interactions. An I164E mutation in the SH2 domain disrupts AS25 binding to the SH2 domain of Bcr-Abl. The detailed mechanisms, however, remain unresolved. Here, molecular dynamics (MD) simulations and binding free energy calculations were performed to explore the conformational and energetic differences between the wild-type (WT) complexes of the Bcr-Abl SH2 domain and AS25 (SH2(WT)-AS25) as well as the mutated complexes (SH2(I164E)-AS25). The results revealed that the I164E mutation not only caused an increase in the conformational flexibility of the SH2-AS25 complexes, but also weakened the binding affinity of AS25 to SH2. The comparative binding modes of SH2-AS25 complexes between WT and the I164E mutant were comprehensively analyzed to unravel the disruption of hydrophobic and hydrogen bonding interactions at the interface of the SH2-AS25 complex triggered by the I164E mutation. The results obtained may help to design the next generation of higher-affinity Bcr-Abl SH2-specific peptide inhibitors.

  6. A Computational Study of the Oligosaccharide Binding Sites in the Lectin-Like Domain of Tumor Necrosis Factor and the TNF-derived TIP Peptide

    Czech Academy of Sciences Publication Activity Database

    Dulebo, A.; Ettrich, Rüdiger; Lucas, R.; Kaftan, D.

    2012-01-01

    Roč. 18, č. 27 (2012), s. 4236-4243 ISSN 1381-6128 Institutional support: RVO:67179843 Keywords : lectin-like domain * tumor necrosis factor * TIP peptide * oligosaccharides * molecular docking * molecular dynamics simulation * alveolar liquid clearance Subject RIV: CE - Biochemistry Impact factor: 3.311, year: 2012

  7. Determining the role of missense mutations in the POU domain of HNF1A that reduce the DNA-binding affinity: A computational approach.

    Directory of Open Access Journals (Sweden)

    Sneha P

    Full Text Available Maturity-onset diabetes of the young type 3 (MODY3) is a non-ketotic form of diabetes associated with poor insulin secretion. Over the past years, several studies have reported the association of missense mutations in the Hepatocyte Nuclear Factor 1 Alpha (HNF1A) gene with MODY3. Missense mutations in the POU homeodomain (POUH) of HNF1A hinder binding to the DNA, thereby leading to a dysfunctional protein. Missense mutations of HNF1A were retrieved from public databases and subjected to a three-step computational mutational analysis to identify the underlying mechanism. First, the pathogenicity and stability of the mutations were analyzed to determine whether they alter protein structure and function. Second, the sequence conservation and DNA-binding sites of the mutant positions were assessed, as the HNF1A protein is a transcription factor. Finally, the biochemical properties of the biological system were validated using molecular dynamics simulations in the Gromacs 4.6.3 package. Two arginine residues (131 and 203) in the HNF1A protein are highly conserved and contribute to the function of the protein. Furthermore, the R131W, R131Q, and R203C mutations were predicted to be highly deleterious by in silico tools and showed lower binding affinity with DNA when compared to the native protein in the molecular docking analysis. Triplicate runs of molecular dynamics (MD) simulations (50 ns) revealed smaller changes in patterns of deviation, fluctuation, and compactness in complexes containing the R131Q and R131W mutations, compared to the complex containing the R203C mutation. We observed a reduction in the number of intermolecular hydrogen bonds, compactness, and electrostatic potential, as well as the loss of salt bridges, in the R203C mutant complex. Substitution of arginine with cysteine at position 203 decreases the affinity of the protein for DNA, thereby destabilizing the protein. Based on our current findings, the MD approach is an important

  8. Domain crossing

    DEFF Research Database (Denmark)

    Schraefel, M. C.; Rouncefield, Mark; Kellogg, Wendy

    2012-01-01

    In CSCW, how much do we need to know about another domain/culture before we observe, intersect and intervene with designs. What optimally would that other culture need to know about us? Is this a “how long is a piece of string” question, or an inquiry where we can consider a variety of contexts a...

  9. Wavefield extrapolation in pseudodepth domain

    KAUST Repository

    Ma, Xuxin

    2013-02-01

    Wavefields are commonly computed in the Cartesian coordinate frame. Its efficiency is inherently limited due to spatial oversampling in deep layers, where the velocity is high and wavelengths are long. To alleviate this computational waste due to uneven wavelength sampling, we convert the vertical axis of the conventional domain from depth to vertical time or pseudodepth. This creates a nonorthogonal Riemannian coordinate system. Isotropic and anisotropic wavefields can be extrapolated in the new coordinate frame with improved efficiency and good consistency with Cartesian domain extrapolation results. Prestack depth migrations are also evaluated based on the wavefield extrapolation in the pseudodepth domain. © 2013 Society of Exploration Geophysicists. All rights reserved.
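
    The depth-to-pseudodepth conversion rests on the vertical-time map tau(z) = integral of dz'/v(z'). A minimal numerical sketch with a toy two-layer velocity model and rectangle-rule integration (not the paper's extrapolation code):

```python
def depth_to_vertical_time(velocities, dz):
    """Cumulative one-way vertical traveltime tau(z) = integral dz'/v(z')
    for a model sampled every dz metres (rectangle-rule integration)."""
    tau = [0.0]
    for v in velocities:
        tau.append(tau[-1] + dz / v)
    return tau

# Toy two-layer model: 1000 m at 2000 m/s over 1000 m at 4000 m/s,
# sampled every 100 m. The fast deep layer spans little vertical time,
# which is exactly the depth-axis oversampling the pseudodepth axis removes.
v = [2000.0] * 10 + [4000.0] * 10
tau = depth_to_vertical_time(v, dz=100.0)
print(round(tau[-1], 6))  # 0.75 (seconds)
```

    Each slow shallow layer contributes 0.05 s of vertical time per 100 m sample but each fast deep layer only 0.025 s, so a grid uniform in tau samples the deep section with proportionally fewer points.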

  10. Domain wall networks on solitons

    International Nuclear Information System (INIS)

    Sutcliffe, Paul

    2003-01-01

    Domain wall networks on the surface of a soliton are studied in a simple theory. It consists of two complex scalar fields, in 3+1 dimensions, with a global U(1)×Z_n symmetry, where n > 2. Solutions are computed numerically in which one of the fields forms a Q-ball and the other field forms a network of domain walls localized on the surface of the Q-ball. Examples are presented in which the domain walls lie along the edges of a spherical polyhedron, forming junctions at its vertices. It is explained why only a small restricted class of polyhedra can arise as domain wall networks

  11. Using computer-assisted process facilitation techniques in government sponsored public meetings and working sessions - a paper addressing the East Fork Poplar Creek Working Group Experience

    International Nuclear Information System (INIS)

    Armstrong, L.D.; Rymer, G.; Perkins, S.

    1994-01-01

    This paper addresses a process facilitation technique using computer hardware and software that assists its users in group decision-making, consensus building, surveying and polling, and strategic planning. The process and equipment have been successfully used by the Department of Energy and Martin Marietta Energy Systems, Inc., Environmental Restoration and Waste Management Community Relations program. The technology is used to solicit and encourage qualitative and documented public feedback in government-mandated or -sponsored public meetings in Oak Ridge, Tennessee

  12. Computational analysis of phosphopeptide binding to the polo-box domain of the mitotic kinase PLK1 using molecular dynamics simulation.

    Directory of Open Access Journals (Sweden)

    David J Huggins

    2010-08-01

    Full Text Available The Polo-Like Kinase 1 (PLK1 acts as a central regulator of mitosis and is over-expressed in a wide range of human tumours where high levels of expression correlate with a poor prognosis. PLK1 comprises two structural elements, a kinase domain and a polo-box domain (PBD. The PBD binds phosphorylated substrates to control substrate phosphorylation by the kinase domain. Although the PBD preferentially binds to phosphopeptides, it has a relatively broad sequence specificity in comparison with other phosphopeptide binding domains. We analysed the molecular determinants of recognition by performing molecular dynamics simulations of the PBD with one of its natural substrates, CDC25c. Predicted binding free energies were calculated using a molecular mechanics, Poisson-Boltzmann surface area approach. We calculated the per-residue contributions to the binding free energy change, showing that the phosphothreonine residue and the mainchain account for the vast majority of the interaction energy. This explains the very broad sequence specificity with respect to other sidechain residues. Finally, we considered the key role of bridging water molecules at the binding interface. We employed inhomogeneous fluid solvation theory to consider the free energy of water molecules on the protein surface with respect to bulk water molecules. Such an analysis highlights binding hotspots created by elimination of water molecules from hydrophobic surfaces. It also predicts that a number of water molecules are stabilized by the presence of the charged phosphate group, and that this will have a significant effect on the binding affinity. Our findings suggest a molecular rationale for the promiscuous binding of the PBD and highlight a role for bridging water molecules at the interface. We expect that this method of analysis will be very useful for probing other protein surfaces to identify binding hotspots for natural binding partners and small molecule inhibitors.
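
    The molecular mechanics, Poisson-Boltzmann surface area (MM-PBSA) estimate used in this study reduces, at the top level, to differencing snapshot-averaged energies of complex, receptor, and ligand. A schematic sketch with invented energy values; entropy and the per-residue decomposition the authors compute are omitted:

```python
def binding_free_energy(complex_g, receptor_g, ligand_g):
    """Single-trajectory MM-PBSA style estimate: average the per-snapshot
    energies (molecular mechanics + polar + nonpolar solvation) of the
    complex, receptor and ligand, then difference the three means."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(complex_g) - mean(receptor_g) - mean(ligand_g)

# Invented per-snapshot total energies (kcal/mol) for three MD frames.
g_bind = binding_free_energy(
    complex_g=[-5200.0, -5195.0, -5205.0],
    receptor_g=[-4100.0, -4095.0, -4105.0],
    ligand_g=[-1050.0, -1048.0, -1052.0],
)
print(g_bind)  # -50.0
```

    A per-residue contribution, as reported for the phosphothreonine, is obtained by applying the same difference to the energy terms attributed to a single residue rather than to the whole system.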

  13. Trusted Domain

    DEFF Research Database (Denmark)

    Hjorth, Theis Solberg; Torbensen, Rune

    2012-01-01

    In the digital age of home automation and with the proliferation of mobile Internet access, the intelligent home and its devices should be accessible at any time from anywhere. There are many challenges, such as security, privacy, ease of configuration, incompatible legacy devices, a wealth of wireless standards, limited resources of embedded systems, etc. Taking these challenges into account, we present a Trusted Domain home automation platform, which dynamically and securely connects heterogeneous networks of Short-Range Wireless devices via simple non-expert user interactions, and allows remote access via IP-based devices such as smartphones. The Trusted Domain platform fits existing legacy technologies by managing their interoperability and access controls, and it seeks to avoid the security issues of relying on third-party servers outside the home. It is a distributed system...

  14. Quality criteria for electronic publications in medicine.

    Science.gov (United States)

    Schulz, S; Auhuber, T; Schrader, U; Klar, R

    1998-01-01

    This paper defines "electronic publications in medicine (EPM)" as computer-based training programs, databases, knowledge-based systems, multimedia applications and electronic books running on standard platforms and available through usual distribution channels. A detailed catalogue of quality criteria as a basis for the development and evaluation of EPMs is presented. The necessity of raising the quality level of electronic publications is stressed, considering aspects of domain knowledge, software engineering, media development, interface design and didactics.

  15. Summative report of the public competition research and development on software for computational science and engineering in the fiscal year 1997 through 2002

    International Nuclear Information System (INIS)

    2005-09-01

    Japan Atomic Energy Research Institute started the public competition research and development on software for computational science and engineering in 1997 and closed it in 2002. This report describes the framework of the competition, application situations, the R and D subjects adopted, evaluation findings, outputs produced, and achievements and problems, as a summative report on the six years of practice of the system. (author)

  16. Technology-mediated language learning (ALMT): an emergent research domain under study through the review of a French scientific journal's publications (original French title: L'apprentissage des langues médiatisé par les technologies (ALMT) – Étude d'un domaine de recherche émergent à travers les publications de la revue Alsic)

    Directory of Open Access Journals (Sweden)

    Nicolas Guichon

    2012-11-01

    Full Text Available In this study, it is postulated that technology-mediated language learning is a research domain that focuses on the design and integration of technologies for language learning and teaching. Because this domain is emergent, the present study first aims at understanding how a community of researchers has developed around this object. Then, through the critical analysis of 79 articles published between 1998 and 2010 in Alsic, a French-speaking online journal, the present article endeavours to define the epistemological contours of this research domain by studying the means employed to produce knowledge.

  17. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science places considerable weight on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and ...

  18. Computational analysis of siRNA recognition by the Ago2 PAZ domain and identification of the determinants of RNA-induced gene silencing.

    Directory of Open Access Journals (Sweden)

    Mahmoud Kandeel

    Full Text Available RNA interference (RNAi) is a highly specialized process of protein-siRNA interaction that results in the regulation of gene expression and cleavage of target mRNA. The PAZ domain of the Argonaute proteins binds to the 3' end of siRNA, and during RNAi the attaching end of the siRNA switches between binding and release from its binding pocket. This biphasic interaction of the 3' end of siRNA with the PAZ domain is essential for RNAi activity; however, it remains unclear whether stronger or weaker binding with the PAZ domain will facilitate or hinder the overall RNAi process. Here we report the correlation between the binding of modified siRNA 3' overhang analogues and their in vivo RNAi efficacy. We found that higher RNAi efficacy was associated with the parameters of lower Ki value, lower total intermolecular energy, lower free energy, higher hydrogen bonding, smaller total surface of interaction and fewer van der Waals interactions. Electrostatic interaction was a minor contributor to compounds recognition, underscoring the presence of phosphate groups in the modified analogues. Thus, compounds with lower binding affinity are associated with better gene silencing. Lower binding strength along with the smaller interaction surface, higher hydrogen bonding and fewer van der Waals interactions were among the markers for favorable RNAi activity. Within the measured parameters, the interaction surface, van der Waals interactions and inhibition constant showed a statistically significant correlation with measured RNAi efficacy. The considerations provided in this report will be helpful in the design of new compounds with better gene silencing ability.
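
The correlation analysis described above (binding parameters such as the inhibition constant Ki versus in vivo silencing efficacy) boils down to a standard correlation coefficient. A minimal, hedged sketch with invented numbers (not the study's data), showing the kind of negative Ki-efficacy association the abstract reports:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented toy values: lower Ki paired with stronger silencing.
ki_values = [1.0, 2.0, 3.0, 4.0, 5.0]       # hypothetical Ki
silencing = [90.0, 80.0, 65.0, 55.0, 40.0]  # hypothetical efficacy (%)
print(round(pearson_r(ki_values, silencing), 3))  # -0.998
```

For the small sample sizes typical of such docking studies, a rank-based (Spearman) correlation would be the more robust choice; Pearson is used here only to keep the sketch short.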

  19. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas; Ouverture des marches energetiques: consequences sur les missions de service public et de securite d'approvisionnement pour l'electricite et le gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general directorate of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - public utility in the domain of energy: definition of the public utility missions, experience feedback about liberalized markets, public utility obligations and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supply, opinions of operators, security of power supplies versus liberalization and investments; 4 - security of gas supplies: market liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and of security of supplies in a competitive market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  20. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    Science.gov (United States)

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer based data analysis. The following factors were significant predictors of having ever done computer based data analysis: ownership of a computer (Adj. OR 1.80, p = 0.02), recently completed course in statistics (Adj. OR 1.48, p = 0.04), and participation in research (Adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.
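
The adjusted odds ratios above come from multivariable logistic regression; as a rough illustration of the quantity being estimated, here is a minimal sketch of an unadjusted odds ratio computed from a hypothetical 2x2 table (the counts are invented for illustration, not the survey's data):

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Unadjusted odds ratio OR = (a/b) / (c/d) for a 2x2 table:
    exposure = owns a computer, outcome = ever did computer-based analysis."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Invented counts: owners 300 analysed / 150 did not;
# non-owners 80 analysed / 70 did not.
or_owning = odds_ratio(300, 150, 80, 70)
print(round(or_owning, 2))  # 1.75
```

An adjusted OR, as reported in the study, would instead come from a multivariable model that conditions on the other predictors (statistics course, research participation) simultaneously.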

  1. Viewing speech in action: speech articulation videos in the public domain that demonstrate the sounds of the International Phonetic Alphabet (IPA)

    OpenAIRE

    Nakai, S.; Beavan, D.; Lawson, E.; Leplâtre, G.; Scobbie, J. M.; Stuart-Smith, J.

    2016-01-01

    In this article, we introduce recently released, publicly available resources, which allow users to watch videos of hidden articulators (e.g. the tongue) during the production of various types of sounds found in the world’s languages. The articulation videos on these resources are linked to a clickable International Phonetic Alphabet chart ([International Phonetic Association. 1999. Handbook of the International Phonetic Association: A Guide to the Use of the International Phonetic Alphabet. ...

  2. Predictors of Biased Self-perception in Individuals with High Social Anxiety: The Effect of Self-consciousness in the Private and Public Self Domains

    Directory of Open Access Journals (Sweden)

    Henrik Nordahl

    2017-07-01

    Full Text Available “Biased self-perception,” the tendency to perceive one’s social performance as more negative than observers do, is characteristic of socially anxious individuals. Self-attention processes are hypothesised to underlie biased self-perception; however, different models emphasise different aspects of self-attention, with attention to the public aspects of the self being prominent. The current study aimed to investigate the relative contribution of two types of dispositional self-attention, public and private self-consciousness, to biased self-perception in a high (n = 48) versus a low (n = 48) social anxiety group undergoing an interaction task. The main finding was that private self-consciousness explained substantial and unique variance in biased negative self-perception in individuals with high social anxiety, while public self-consciousness did not. This relationship was independent of increments in state anxiety. Private self-consciousness appeared to have a specific association with bias related to overestimation of negative social performance rather than underestimation of positive social performance. The implication of this finding is that current treatment models of social anxiety disorder might include broader aspects of self-focused attention, especially in the context of formulating self-evaluation biases.

  3. Predictors of Biased Self-perception in Individuals with High Social Anxiety: The Effect of Self-consciousness in the Private and Public Self Domains.

    Science.gov (United States)

    Nordahl, Henrik; Plummer, Alice; Wells, Adrian

    2017-01-01

    "Biased self-perception," the tendency to perceive one's social performance as more negative than observers do, is characteristic of socially anxious individuals. Self-attention processes are hypothesised to underlie biased self-perception; however, different models emphasise different aspects of self-attention, with attention to the public aspects of the self being prominent. The current study aimed to investigate the relative contribution of two types of dispositional self-attention, public and private self-consciousness, to biased self-perception in a high (n = 48) versus a low (n = 48) social anxiety group undergoing an interaction task. The main finding was that private self-consciousness explained substantial and unique variance in biased negative self-perception in individuals with high social anxiety, while public self-consciousness did not. This relationship was independent of increments in state anxiety. Private self-consciousness appeared to have a specific association with bias related to overestimation of negative social performance rather than underestimation of positive social performance. The implication of this finding is that current treatment models of social anxiety disorder might include broader aspects of self-focused attention, especially in the context of formulating self-evaluation biases.

  4. 78 FR 54453 - Notice of Public Meeting-Intersection of Cloud Computing and Mobility Forum and Workshop

    Science.gov (United States)

    2013-09-04

    ...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...

  5. The effects of computer-based mindfulness training on Self-control and Mindfulness within Ambulatorily assessed network Systems across Health-related domains in a healthy student population (SMASH): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Rowland, Zarah; Wenzel, Mario; Kubiak, Thomas

    2016-12-01

    Self-control is an important ability in everyday life, showing associations with health-related outcomes. The aim of the Self-control and Mindfulness within Ambulatorily assessed network Systems across Health-related domains (SMASH) study is twofold: first, the effectiveness of a computer-based mindfulness training will be evaluated in a randomized controlled trial. Second, the SMASH study implements a novel network approach in order to investigate complex temporal interdependencies of self-control networks across several domains. The SMASH study is a two-armed, 6-week, non-blinded randomized controlled trial that combines seven weekly laboratory meetings and 40 days of electronic diary assessments with six prompts per day in a healthy undergraduate student population at the Johannes Gutenberg University Mainz, Germany. Participants will be randomly assigned to (1) receive a computer-based mindfulness intervention or (2) to a wait-list control condition. Primary outcomes are self-reported momentary mindfulness and self-control assessed via electronic diaries. Secondary outcomes are habitual mindfulness and habitual self-control. Further measures include self-reported behaviors in specific self-control domains: emotion regulation, alcohol consumption and eating behaviors. The effects of mindfulness training on primary and secondary outcomes are explored using three-level mixed models. Furthermore, networks will be computed with vector autoregressive mixed models to investigate the dynamics at participant and group level. This study was approved by the local ethics committee (reference code 2015_JGU_psychEK_011) and follows the standards laid down in the Declaration of Helsinki (2013). This randomized controlled trial combines an intensive Ambulatory Assessment of 40 consecutive days and seven laboratory meetings. By implementing a novel network approach, underlying processes of self-control within different health domains will be identified. These results will

  6. Computational modeling of the bHLH domain of the transcription factor TWIST1 and R118C, S144R and K145E mutants

    Directory of Open Access Journals (Sweden)

    Maia Amanda M

    2012-07-01

    Full Text Available Background: Human TWIST1 is a highly conserved member of the regulatory basic helix-loop-helix (bHLH) transcription factors. TWIST1 forms homo- or heterodimers with E-box proteins, such as E2A (isoforms E12 and E47), MYOD and HAND2. Haploinsufficiency germ-line mutations of the twist1 gene in humans are the main cause of Saethre-Chotzen syndrome (SCS), which is characterized by limb abnormalities and premature fusion of cranial sutures. Because of the importance of TWIST1 in the regulation of embryonic development and its relationship with SCS, along with the lack of an experimentally solved 3D structure, we performed comparative modeling for the TWIST1 bHLH region arranged into wild-type homodimers and heterodimers with E47. In addition, three mutations that promote DNA binding failure (R118C, S144R and K145E) were studied on the TWIST1 monomer. We also explored the behavior of the mutant forms in aqueous solution using molecular dynamics (MD) simulations, focusing on the structural changes of the wild-type versus mutant dimers. Results: The solvent-accessible surface area of the homodimers was smaller on wild-type dimers, which indicates that the cleft between the monomers remained more open on the mutant homodimers. RMSD and RMSF analyses indicated that mutated dimers presented values that were higher than those for the wild-type dimers. For a more careful investigation, the monomer was subdivided into four regions: basic, helix I, loop and helix II. The basic domain presented a higher flexibility in all of the parameters that were analyzed, and the mutant dimer basic domains presented values that were higher than the wild-type dimers. The essential dynamics analysis also indicated a higher collective motion for the basic domain. Conclusions: Our results suggest that the studied mutations turned the dimers into more unstable structures with a wider cleft, which may be a reason for the loss of DNA binding capacity observed for in vitro
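
The RMSD comparisons between wild-type and mutant dimers rest on a simple quantity: the root-mean-square deviation between two sets of atomic coordinates. A minimal sketch over toy coordinates (no structural superposition step, and the coordinates are invented, not TWIST1 data):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length lists of
    (x, y, z) atom coordinates.  Real MD analyses superpose the
    structures first; that step is omitted here for brevity."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Invented three-atom toy "structures".
wild_type = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
mutant    = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 0.0, 1.0)]
print(round(rmsd(wild_type, mutant), 3))  # 0.816
```

RMSF, also used in the study, is the per-atom analogue: the fluctuation of each atom around its average position over the trajectory rather than a single deviation between two frames.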

  7. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  8. Independent and arbitrary generation of spots in the 3D space domain with computer generated holograms written on a phase-only liquid crystal spatial light modulator

    International Nuclear Information System (INIS)

    Wang, Dong; Zhang, Jian; Xia, Yang; Wang, Hao

    2012-01-01

    An improved multiple independent iterative plane algorithm, based on a projection optimization idea, is proposed for the independent and arbitrary generation of one spot or multiple spots in a speckle-suppressed 3D work-area. Details of the mathematical expressions of the algorithm are given to theoretically show how it is improved for 3D spot generation. Both simulations and experiments are conducted to investigate the performance of the algorithm for independent and arbitrary 3D spot generation in several different cases. Simulation results agree well with experimental results, which validates the effectiveness of the algorithm proposed. Several additional experiments are demonstrated for fast and independent generation of four or more spots in the 3D space domain, which confirms the capabilities and practicalities of the algorithm further. (paper)

  9. A computational method for the coupled solution of reaction-diffusion equations on evolving domains and manifolds: Application to a model of cell migration and chemotaxis.

    Science.gov (United States)

    MacDonald, G; Mackenzie, J A; Nolan, M; Insall, R H

    2016-03-15

    In this paper, we devise a moving mesh finite element method for the approximate solution of coupled bulk-surface reaction-diffusion equations on an evolving two dimensional domain. Fundamental to the success of the method is the robust generation of bulk and surface meshes. For this purpose, we use a novel moving mesh partial differential equation (MMPDE) approach. The developed method is applied to model problems with known analytical solutions; these experiments indicate second-order spatial and temporal accuracy. Coupled bulk-surface problems occur frequently in many areas; in particular, in the modelling of eukaryotic cell migration and chemotaxis. We apply the method to a model of the two-way interaction of a migrating cell in a chemotactic field, where the bulk region corresponds to the extracellular region and the surface to the cell membrane.

  10. Toward fish and seafood traceability: anchovy species determination in fish products by molecular markers and support through a public domain database.

    Science.gov (United States)

    Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier

    2008-05-28

    Traceability in the fish food sector plays an increasingly important role for consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability on national and international levels. Although traceability through labeling is well established and supported by respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can efficiently be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provides the data obtained during our study and tools for analytical purposes.
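
The species-determination step, matching a short diagnostic mitochondrial fragment against reference sequences, can be sketched as a simple containment lookup. This is a toy stand-in for real alignment/BLAST-style comparison, and the sequences below are invented placeholders, not actual cytochrome b data:

```python
def identify_species(query, references):
    """Assign a species by exact containment of a short diagnostic
    fragment in a reference sequence.  Real workflows align the
    212-274 bp fragment against curated cyt b / 16S / COI references
    and score the match; exact containment is used here for brevity."""
    for species, ref in references.items():
        if query in ref:
            return species
    return "unknown"

# Invented placeholder "reference" sequences (NOT real mitochondrial data).
refs = {
    "Engraulis encrasicolus": "ATGGCCTTACGAAAACACC",
    "Engraulis ringens": "ATGGCTTTCCGTAAGCATC",
}
print(identify_species("TTACGAAAA", refs))  # Engraulis encrasicolus
```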

  11. Prion-Like Domains in Phagobiota

    Directory of Open Access Journals (Sweden)

    George Tetz

    2017-11-01

    Full Text Available Prions are molecules characterized by self-propagation, which can undergo a conformational switch leading to the creation of new prions. Prion proteins have originally been associated with the development of mammalian pathologies; however, recently they have been shown to contribute to environmental adaptation in a variety of prokaryotic and eukaryotic organisms. Bacteriophages are widespread, represent important regulators of microbiota homeostasis, and have been shown to be diverse across various bacterial families. Here, we examined whether bacteriophages contain prion-like proteins and whether these prion-like protein domains are involved in the regulation of homeostasis. We used a computational algorithm, prion-like amino acid composition, to detect prion-like domains in 370,617 publicly available bacteriophage protein sequences, which resulted in the identification of 5040 putative prions. We analyzed a set of these prion-like proteins and observed regularities in their distribution across different phage families, associated with their interactions with the bacterial host cells. We found that prion-like domains could be found across all phages of various groups of bacteria and archaea. The results obtained in this study indicate that bacteriophage prion-like proteins are predominantly involved in the interactions between bacteriophages and bacterial cells, such as those associated with the attachment and penetration of the bacteriophage into the cell, and the release of the phage progeny. These data allow the identification of phage prion-like proteins as novel regulators of the interactions between bacteriophages and bacterial cells.
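
The study used a prion-like amino acid composition algorithm; as a crude stand-in for the underlying idea of compositional bias detection, here is a sliding-window scorer that flags Q/N-rich stretches. The sequence, window size, and threshold are invented for illustration and much simpler than the actual algorithm:

```python
def qn_rich_windows(seq, window=10, threshold=0.5):
    """Return (start, fraction) pairs for windows whose Q/N fraction
    meets the threshold -- a crude proxy for the compositional bias
    that prion-like domain detectors score."""
    hits = []
    for i in range(len(seq) - window + 1):
        frac = sum(seq[i + j] in "QN" for j in range(window)) / window
        if frac >= threshold:
            hits.append((i, frac))
    return hits

toy = "MAQQNNQQNQSAGRKLLAAA"  # invented sequence, not a phage protein
print(qn_rich_windows(toy))
```

A real detector uses a likelihood model over full amino-acid composition learned from known prion domains, not a fixed Q/N cutoff, but the sliding-window structure is the same.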

  12. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  13. DIMA 3.0: Domain Interaction Map.

    Science.gov (United States)

    Luo, Qibin; Pagel, Philipp; Vilne, Baiba; Frishman, Dmitrij

    2011-01-01

    Domain Interaction MAp (DIMA, available at http://webclu.bio.wzw.tum.de/dima) is a database of predicted and known interactions between protein domains. It integrates 5807 structurally known interactions imported from the iPfam and 3did databases and 46,900 domain interactions predicted by four computational methods: domain phylogenetic profiling, the domain pair exclusion algorithm, correlated mutations, and domain interaction prediction in a discriminative way. Additionally, predictions are filtered to exclude those domain pairs that are reported as non-interacting by the Negatome database. The DIMA Web site allows users to calculate domain interaction networks either for a domain of interest or for entire organisms, and to explore them interactively using the Flash-based Cytoscape Web software.
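
A domain interaction map of this kind is, at its core, an undirected graph keyed by domain identifiers, with each edge annotated by its evidence source. A minimal sketch; the Pfam accessions are real identifiers, but the edges and source labels are illustrative, not actual DIMA records:

```python
from collections import defaultdict

def build_domain_map(interactions):
    """Build an undirected domain-domain interaction map from
    (domain_a, domain_b, source) edges."""
    dmap = defaultdict(set)
    for a, b, _source in interactions:
        dmap[a].add(b)
        dmap[b].add(a)
    return dmap

edges = [
    ("PF00069", "PF00017", "iPfam"),      # kinase - SH2 (illustrative edge)
    ("PF00069", "PF00018", "predicted"),  # kinase - SH3 (illustrative edge)
    ("PF00017", "PF00018", "3did"),       # SH2 - SH3 (illustrative edge)
]
dmap = build_domain_map(edges)
print(sorted(dmap["PF00069"]))  # ['PF00017', 'PF00018']
```

The Negatome-style filtering described above would simply remove any edge whose (a, b) pair appears in a set of known non-interacting domain pairs before the map is built.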

  14. Computational split-field finite-difference time-domain evaluation of simplified tilt-angle models for parallel-aligned liquid-crystal devices

    Science.gov (United States)

    Márquez, Andrés; Francés, Jorge; Martínez, Francisco J.; Gallego, Sergi; Álvarez, Mariela L.; Calzado, Eva M.; Pascual, Inmaculada; Beléndez, Augusto

    2018-03-01

    Simplified analytical models with predictive capability enable simpler and faster optimization of the performance in applications of complex photonic devices. We recently demonstrated the most simplified analytical model still showing predictive capability for parallel-aligned liquid crystal on silicon (PA-LCoS) devices, which provides the voltage-dependent retardance for a very wide range of incidence angles and any wavelength in the visible. We further show that the proposed model is not only phenomenological but also physically meaningful, since two of its parameters provide the correct values for important internal properties of these devices related to the birefringence, cell gap, and director profile. Therefore, the proposed model can be used as a means to inspect internal physical properties of the cell. As an innovation, we also show the applicability of the split-field finite-difference time-domain (SF-FDTD) technique for phase-shift and retardance evaluation of PA-LCoS devices under oblique incidence. As a simplified model for PA-LCoS devices, we also consider the exact description of homogeneous birefringent slabs. However, we show that, despite its higher degree of simplification, the proposed model is more robust, providing unambiguous and physically meaningful solutions when fitting its parameters.

  15. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  16. .Gov Domains API

    Data.gov (United States)

    General Services Administration — This dataset offers the list of all .gov domains, including state, local, and tribal .gov domains. It does not include .mil domains, or other federal domains outside...

  17. Evaluation to Obtain the Image According to the Spatial Domain Filtering of Various Convolution Kernels in the Multi-Detector Row Computed Tomography

    International Nuclear Information System (INIS)

    Lee, Hoo Min; Yoo, Beong Gyu; Kweon, Dae Cheol

    2008-01-01

    Our objective was to evaluate spatial domain filtering of images as an alternative to additional image reconstruction with different kernels in MDCT. Images derived from thin collimated source data were generated using a water phantom and the abdomen with B10 (very smooth), B20 (smooth), B30 (medium smooth), B40 (medium), B50 (medium sharp), B60 (sharp), B70 (very sharp) and B80 (ultra sharp) kernels. MTF and spatial resolution were measured with the various convolution kernels, and quantitative CT attenuation coefficient and noise measurements provided comparable Hounsfield units (HU). CT attenuation coefficient (mean HU) values were 1.1∼1.8 HU in water and -998∼-1000 HU in air, with noise of 5.4∼44.8 HU in water and 3.6∼31.4 HU in air. In abdominal fat, a CT attenuation coefficient of -2.2∼0.8 HU and noise of 10.1∼82.4 HU were measured; in abdominal muscle, a CT attenuation coefficient of 53.3∼54.3 HU and noise of 10.4∼70.7 HU; and in the liver parenchyma, a CT attenuation coefficient of 60.4∼62.2 HU and noise of 7.6∼63.8 HU. Images reconstructed with a sharp convolution kernel (B80) showed increased noise, whereas the CT attenuation coefficients remained comparable. Modifying image sharpness and noise by spatial domain filtering may eliminate the need for reconstruction with different kernels in the future. Selecting kernels appropriate to the examination being performed may thus control image quality and increase diagnostic accuracy.
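
The reported mean HU and noise figures are ROI statistics: the mean and standard deviation of CT numbers inside a region of interest. A minimal sketch with invented samples (a real ROI would be a pixel region extracted from the reconstructed image):

```python
import math

def roi_stats(hu_values):
    """Mean CT number (HU) and noise (population standard deviation)
    for a region of interest."""
    n = len(hu_values)
    mean = sum(hu_values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in hu_values) / n)
    return mean, sd

water_roi = [0.0, 2.0, 1.0, 3.0, -1.0, 1.0]  # invented HU samples
mean, noise = roi_stats(water_roi)
print(round(mean, 2), round(noise, 2))  # 1.0 1.29
```

Sharper kernels raise the noise term while leaving the mean essentially unchanged, which is exactly the pattern the abstract reports.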

  18. Optimizing security of cloud computing within the DoD

    OpenAIRE

    Antedomenico, Noemi

    2010-01-01

    Approved for public release; distribution is unlimited What countermeasures best strengthen the confidentiality, integrity and availability (CIA) of the implementation of cloud computing within the DoD? This question will be answered by analyzing threats and countermeasures within the context of the ten domains comprising the Certified Information System Security Professional (CISSP) Common Body of Knowledge (CBK). The ten domains that will be used in this analysis include access control; ...

  19. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  20. Time-domain modeling of electromagnetic diffusion with a frequency-domain code

    NARCIS (Netherlands)

    Mulder, W.A.; Wirianto, M.; Slob, E.C.

    2007-01-01

    We modeled time-domain EM measurements of induction currents for marine and land applications with a frequency-domain code. An analysis of the computational complexity of a number of numerical methods shows that frequency-domain modeling followed by a Fourier transform is an attractive choice if a
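
The workflow described, solving in the frequency domain and then obtaining the time-domain response by Fourier transform, can be sketched with a naive inverse DFT. This is a toy one-signal example, not the paper's EM diffusion solver:

```python
import cmath, math

def inverse_dft(spectrum):
    """Naive inverse discrete Fourier transform (O(n^2), for clarity)."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

# Toy "frequency-domain solve": forward-transform a known cosine, then
# recover the time-domain response, mimicking the modeling workflow.
n = 8
time_true = [math.cos(2 * math.pi * t / n) for t in range(n)]
spectrum = [sum(time_true[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]
time_rec = [x.real for x in inverse_dft(spectrum)]
max_err = max(abs(a - b) for a, b in zip(time_true, time_rec))
print(max_err < 1e-9)  # True
```

The attraction noted in the abstract is that the frequency-domain system need only be solved at a modest set of frequencies, after which the transform synthesizes the full time response.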

  1. External validation of a publicly available computer assisted diagnostic tool for mammographic mass lesions with two high prevalence research datasets.

    Science.gov (United States)

    Benndorf, Matthias; Burnside, Elizabeth S; Herda, Christoph; Langer, Mathias; Kotter, Elmar

    2015-08-01

    Lesions detected at mammography are described with a highly standardized terminology: the breast imaging-reporting and data system (BI-RADS) lexicon. Up to now, no validated semantic computer assisted classification algorithm exists to interactively link combinations of morphological descriptors from the lexicon to a probabilistic risk estimate of malignancy. The authors therefore aim at the external validation of the mammographic mass diagnosis (MMassDx) algorithm. A classification algorithm like MMassDx must perform well in a variety of clinical circumstances and in datasets that were not used to generate the algorithm in order to ultimately become accepted in clinical routine. The MMassDx algorithm uses a naïve Bayes network and calculates post-test probabilities of malignancy based on two distinct sets of variables, (a) BI-RADS descriptors and age ("descriptor model") and (b) BI-RADS descriptors, age, and BI-RADS assessment categories ("inclusive model"). The authors evaluate both the MMassDx (descriptor) and MMassDx (inclusive) models using two large publicly available datasets of mammographic mass lesions: the digital database for screening mammography (DDSM) dataset, which contains two subsets from the same examinations-a medio-lateral oblique (MLO) view and cranio-caudal (CC) view dataset-and the mammographic mass (MM) dataset. The DDSM contains 1220 mass lesions and the MM dataset contains 961 mass lesions. The authors evaluate discriminative performance using area under the receiver-operating-characteristic curve (AUC) and compare this to the BI-RADS assessment categories alone (i.e., the clinical performance) using the DeLong method. The authors also evaluate whether assigned probabilistic risk estimates reflect the lesions' true risk of malignancy using calibration curves. The authors demonstrate that the MMassDx algorithms show good discriminatory performance. 
AUC for the MMassDx (descriptor) model in the DDSM data is 0.876/0.895 (MLO/CC view) and AUC
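    The probabilistic core of such an approach, combining per-descriptor likelihoods under the naïve Bayes conditional-independence assumption into a post-test probability, can be sketched as follows. All priors and likelihoods below are invented for illustration and are not the published MMassDx parameters:

```python
# Hypothetical values only; NOT the published MMassDx parameters.
# For each descriptor value: (P(value | malignant), P(value | benign)).
PRIOR_MALIGNANT = 0.3
LIKELIHOOD = {
    "margin": {"spiculated": (0.60, 0.05), "circumscribed": (0.05, 0.55)},
    "shape":  {"irregular":  (0.55, 0.10), "oval":          (0.10, 0.50)},
}

def posterior_malignant(findings):
    """Naive Bayes post-test probability of malignancy given descriptors."""
    p_mal, p_ben = PRIOR_MALIGNANT, 1.0 - PRIOR_MALIGNANT
    for descriptor, value in findings.items():
        l_mal, l_ben = LIKELIHOOD[descriptor][value]
        p_mal *= l_mal   # accumulate likelihoods, assuming independence
        p_ben *= l_ben
    return p_mal / (p_mal + p_ben)  # renormalize over the two classes

p_suspicious = posterior_malignant({"margin": "spiculated", "shape": "irregular"})
p_benign = posterior_malignant({"margin": "circumscribed", "shape": "oval"})
```

    With these toy numbers a spiculated, irregular mass receives a high post-test probability and a circumscribed oval mass a low one; calibration, as evaluated in the paper, asks whether such probabilities match the observed malignancy rates.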

  2. BUILDING A COMPLETE FREE AND OPEN SOURCE GIS INFRASTRUCTURE FOR HYDROLOGICAL COMPUTING AND DATA PUBLICATION USING GIS.LAB AND GISQUICK PLATFORMS

    Directory of Open Access Journals (Sweden)

    M. Landa

    2017-07-01

    Full Text Available Building a complete free and open source GIS computing and data publication platform can be a relatively easy task. This paper describes an automated deployment of such a platform using two open source software projects – GIS.lab and Gisquick. GIS.lab (http://web.gislab.io) is a project for rapid deployment of a complete, centrally managed and horizontally scalable GIS infrastructure in the local area network, data center or cloud. It provides a comprehensive set of free geospatial software seamlessly integrated into one easy-to-use system. A platform for GIS computing (in our case demonstrated on hydrological data processing) requires core components such as a geoprocessing server, a map server, and a computation engine, e.g. GRASS GIS, SAGA, or other similar GIS software. All these components can be rapidly and automatically deployed by the GIS.lab platform. In the demonstrated solution, PyWPS is used for serving WPS processes built on top of the GRASS GIS computation platform. GIS.lab can be easily extended by other components running in Docker containers; this approach is demonstrated by the seamless integration of Gisquick. Gisquick (http://gisquick.org) is an open source platform for publishing geospatial data in the sense of rapid sharing of QGIS projects on the web. The platform consists of a QGIS plugin, a Django-based server application, QGIS server, and web/mobile clients. This paper shows how to easily deploy a complete open source GIS infrastructure supporting all required operations: data preparation on the desktop, data sharing, and geospatial computation as a service. It also includes data publication in the sense of OGC Web Services and, importantly, as interactive web mapping applications.
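    A client interacts with such a PyWPS geoprocessing service through standard OGC WPS requests. The sketch below builds WPS 1.0.0 key-value-pair request URLs; the endpoint and the process identifier are hypothetical, not taken from the paper:

```python
from urllib.parse import urlencode

# Hypothetical endpoint of a PyWPS server deployed by GIS.lab.
WPS_ENDPOINT = "http://server.gis.lab/services/wps"

def wps_request_url(request, identifier=None):
    """Build an OGC WPS 1.0.0 key-value-pair (GET) request URL."""
    params = {"service": "WPS", "version": "1.0.0", "request": request}
    if identifier:
        params["identifier"] = identifier
    return WPS_ENDPOINT + "?" + urlencode(params)

caps_url = wps_request_url("GetCapabilities")
desc_url = wps_request_url("DescribeProcess", identifier="r.watershed")
```

    Fetching the GetCapabilities URL lists the processes the server exposes; DescribeProcess then reports each process's inputs and outputs.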

  3. Quality Computer Assisted Mobile Learning (CAML) and Distance Education Leadership in Managing Technology Enhanced Learning Management System (TELMS) in the Malaysian Public Tertiary Education

    Directory of Open Access Journals (Sweden)

    Lee Tan Luck

    2009-07-01

    Full Text Available Abstract - The success of implementing quality computer assisted mobile learning and distance education in a Technology Enhanced Learning Management System relies heavily on academic leadership in managing and applying Information and Communication Technology (ICT) at the tertiary level. The effectiveness of this leadership, and of the knowledge, application and management of ICT and the learning management system, is of utmost importance. Successful application and management include the quality and cost-effectiveness of university administration, CAML and distance education leadership development, organizational culture, academic staff and students' attitudes and their commitment to the teaching and learning process, support for the usage of state-of-the-art techno-educational facilities, availability of ICT resources, and maintenance and funding of a Learning Management System. This paper discusses the above factors, which form a comprehensive framework for implementing a quality CAML and distance education environment in ICT application and management in Malaysian public universities. Fifty-two respondents were selected from two Malaysian public universities which offer e-learning and distance education with a Learning Management System. A survey questionnaire was used to determine the effectiveness of ICT and mobile learning application management. Data from the questionnaires were analyzed using non-parametric and parametric statistical tests. Results of this study show a significant difference in CAML and distance education leadership in TELMS and in the application of ICT and its management in Malaysian public universities. The study also addresses the implementation elements necessary for transforming the public universities and their CAML and distance education teaching and learning process into an effective and result-oriented computer assisted mobile learning management model in public

  4. Ubiquitin domain proteins in disease

    DEFF Research Database (Denmark)

    Klausen, Louise Kjær; Schulze, Andrea; Seeger, Michael

    2007-01-01

    The human genome encodes several ubiquitin-like (UBL) domain proteins (UDPs). Members of this protein family are involved in a variety of cellular functions and many are connected to the ubiquitin proteasome system, an essential pathway for protein degradation in eukaryotic cells. Despite...... and cancer. Publication history: Republished from Current BioData's Targeted Proteins database (TPdb; http://www.targetedproteinsdb.com)....

  5. A protein domain interaction interface database: InterPare

    Directory of Open Access Journals (Sweden)

    Lee Jungsul

    2005-08-01

    Full Text Available Abstract Background Most proteins function by interacting with other molecules. Their interaction interfaces are highly conserved throughout evolution to avoid undesirable interactions that lead to fatal disorders in cells. Rational drug discovery includes computational methods to identify the interaction sites of lead compounds to the target molecules. Identifying and classifying protein interaction interfaces on a large scale can help researchers discover drug targets more efficiently. Description We introduce a large-scale protein domain interaction interface database called InterPare (http://interpare.net). It contains both inter-chain (between chains) interfaces and intra-chain (within chain) interfaces. InterPare uses three methods to detect interfaces: (1) the geometric distance method, which checks the distance between atoms that belong to different domains; (2) Accessible Surface Area (ASA), a method for detecting the buried region of a protein that is detached from a solvent when forming multimers or complexes; and (3) the Voronoi diagram, a computational geometry method that uses a mathematical definition of interface regions. InterPare includes visualization tools to display protein interior, surface, and interaction interfaces. It also provides statistics such as the amino acid propensities of a queried protein according to its interior, surface, and interface region. The atom coordinates that belong to interface, surface, and interior regions can be downloaded from the website. Conclusion InterPare is an open and public database server for protein interaction interface information. It contains large-scale interface data for proteins whose 3D structures are known. As of November 2004, there were 10,583 (geometric distance), 10,431 (ASA), and 11,010 (Voronoi diagram) entries in the Protein Data Bank (PDB) containing interfaces, according to the above three methods.
In the case of the geometric distance method, there are 31,620 inter-chain domain-domain
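    The geometric distance method reduces to a pairwise distance test between atoms belonging to different domains. A minimal sketch with invented coordinates and an illustrative 5 Å cutoff (not necessarily the threshold InterPare uses):

```python
from math import dist

CUTOFF = 5.0  # illustrative interface cutoff in angstroms

# Toy atom records: (domain_id, atom_id, (x, y, z)).
atoms = [
    ("A", 1, (0.0, 0.0, 0.0)),
    ("A", 2, (10.0, 0.0, 0.0)),
    ("B", 3, (3.0, 0.0, 0.0)),
    ("B", 4, (30.0, 0.0, 0.0)),
]

def interface_atoms(atoms, cutoff=CUTOFF):
    """Return ids of atoms within `cutoff` of any atom in another domain."""
    hits = set()
    for d1, i1, p1 in atoms:
        for d2, i2, p2 in atoms:
            if d1 != d2 and dist(p1, p2) <= cutoff:
                hits.add(i1)  # i1 sits at an inter-domain interface
    return sorted(hits)

contacts = interface_atoms(atoms)
```

    A real implementation would restrict the double loop with a spatial index (grid or k-d tree), since PDB-scale structures make the naive O(n²) scan expensive.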

  6. Trends of Mobile Learning in Computing Education from 2006 to 2014: A Systematic Review of Research Publications

    Science.gov (United States)

    Anohah, Ebenezer; Oyelere, Solomon Sunday; Suhonen, Jarkko

    2017-01-01

    The majority of the existing research regarding mobile learning in computing education has primarily focused on studying the effectiveness of, and in some cases reporting about, implemented mobile learning solutions. However, it is equally important to explore development and application perspectives on the integration of mobile learning into…

  7. A Fast, Efficient Domain Adaptation Technique for Cross-Domain Electroencephalography (EEG)-Based Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Xin Chai

    2017-05-01

    Full Text Available Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis for patients. However, the underlying EEG sensor signals are always non-stationary if they are sampled from different experimental sessions or subjects. This results in the deterioration of the classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy of marginal distribution. However, for EEG sensor signals, both marginal and conditional distributions may be mismatched. In addition, the existing domain adaptation strategies always require a high level of additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper in order to integrate both the marginal and conditional distributions within a unified framework (without any labeled samples from target subjects). Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term. This significantly decreases the time complexity of our domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and unlabeled target domain can be reduced, and logistic regression (LR) can be applied to the new source domain in order to train a classifier for use in the target domain, since the aligned source domain follows a distribution which is similar to that of the target domain. We compare our ASFM method with six typical approaches using a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed. 
The subject-to-subject offline experimental results demonstrate that our component achieves a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, as compared with a state-of-the-art method, the subspace alignment auto-encoder (SAAE), which
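    ASFM itself learns a linear subspace transformation; as a deliberately simplified stand-in for matching marginal distributions, the sketch below shifts and scales each source feature so its mean and standard deviation match those of the unlabeled target domain. Function names and data are illustrative, not from the paper:

```python
from statistics import mean, stdev

def column(rows, j):
    return [r[j] for r in rows]

def align_marginals(source, target):
    """Per-feature moment matching: map each source feature onto the
    target domain's mean/std (features assumed non-constant). A classifier
    such as logistic regression would then be trained on the aligned
    source data and applied directly to the target domain."""
    n_feat = len(source[0])
    out = []
    for row in source:
        new_row = []
        for j in range(n_feat):
            s, t = column(source, j), column(target, j)
            new_row.append((row[j] - mean(s)) / stdev(s) * stdev(t) + mean(t))
        out.append(new_row)
    return out

aligned = align_marginals([[0.0], [2.0]], [[10.0], [14.0]])
```

    This per-feature version ignores feature correlations, which is exactly what a subspace method like ASFM adds back by aligning principal subspaces rather than individual coordinates.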

  8. Editorial: From Public Domain to Islamic Philosophy

    Directory of Open Access Journals (Sweden)

    Editor Al-Jami'ah: Journal of Islamic Studies

    2012-07-01

    Full Text Available The current edition of Al-Jami’ah: Journal of Islamic Studies throws light on various themes related to the contemporary development of Islamic religiosity in its multiple contexts. Indeed, Islam has become an inspiration for many who uphold its dogma. Muslims, like those who embrace other faiths, have participated in the creation of various traditions throughout human history. The author detects the internal development of the PKS, in which its activists responded to gender issues in two ways. The first group remains conservative, in line with Islamist ideology. The second group seems to start looking at the issue in a rather progressive way. Whereas the conservative wing displays its ideological stance, the progressive wing exhibits pragmatic steps in response to the demands of the current context. After all, this shows ambiguity in the party’s position pertaining to gender issues.

  9. Technique for designing a domain ontology

    OpenAIRE

    Palagin, A. V.; Petrenko, N. G.; Malakhov, K. S.

    2018-01-01

    The article describes a technique for designing a domain ontology, shows the flowchart of the design algorithm, and considers an example of constructing a fragment of the ontology for the subject area of Computer Science.

  10. Domain similarity based orthology detection.

    Science.gov (United States)

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich

    2015-05-13

    Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows quadratically. A current challenge of bioinformatic research, especially when taking into account the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationally feasible in a reasonable amount of time. We propose to speed up the detection of orthologous proteins by using strings of domains to characterize the proteins. We present two new protein similarity measures, a cosine and a maximal weight matching score based on domain content similarity, and new software, named porthoDom. The qualities of the cosine and the maximal weight matching similarity measures are compared against curated datasets. The measures show that domain content similarities are able to correctly group proteins into their families. Accordingly, the cosine similarity measure is used inside porthoDom, the wrapper developed for proteinortho. porthoDom makes use of domain content similarity measures to group proteins together before searching for orthologs. By using domains instead of amino acid sequences, the reduction of the search space decreases the computational complexity of an all-against-all sequence comparison. We demonstrate that representing and comparing proteins as strings of discrete domains, i.e. as a concatenation of their unique identifiers, allows a drastic simplification of the search space. porthoDom has the advantage of speeding up orthology detection while maintaining a degree of accuracy similar to proteinortho. The implementation of porthoDom is released in Python and C++ and is available under the GNU GPL licence 3 at http://www.bornberglab.org/pages/porthoda .
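    The cosine measure over domain content can be sketched as follows: each protein is represented as a bag of domain identifiers, and similarity is the cosine of the corresponding count vectors. The domain IDs below are illustrative:

```python
from collections import Counter
from math import sqrt

def domain_cosine(domains_a, domains_b):
    """Cosine similarity between two proteins represented as bags of
    domain identifiers (e.g. Pfam accessions)."""
    a, b = Counter(domains_a), Counter(domains_b)
    dot = sum(a[d] * b[d] for d in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

same = domain_cosine(["PF00069", "PF07714"], ["PF00069", "PF07714"])
disjoint = domain_cosine(["PF00069"], ["PF00433"])
```

    Because proteins typically carry far fewer domains than residues, comparing such vectors is much cheaper than an all-against-all sequence alignment, which is the source of the claimed speed-up.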

  11. Fractional-Fourier-domain weighted Wigner distribution

    NARCIS (Netherlands)

    Stankovic, L.; Alieva, T.; Bastiaans, M.J.

    2001-01-01

    A fractional-Fourier-domain realization of the weighted Wigner distribution (or S-method), producing auto-terms close to the ones in the Wigner distribution itself, but with reduced cross-terms, is presented. The computational cost of this fractional-domain realization is the same as the

  12. Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

  13. Application of ubiquitous computing in personal health monitoring systems.

    Science.gov (United States)

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    One way to significantly reduce the costs of public health systems is to make greater use of information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  14. Climate Resilience Screening Index and Domain Scores

    Data.gov (United States)

    U.S. Environmental Protection Agency — CRSI and related-domain scores for all 50 states and 3135 counties in the U.S. This dataset is not publicly accessible because: They are already available within the...

  15. Dynamics of domain coverage of the protein sequence universe

    Science.gov (United States)

    2012-01-01

    Background The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its “dark matter”. Results Here we suggest that the true size of “dark matter” is much larger than stated by current definitions. We propose an approach to reducing the size of “dark matter” by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Conclusions Recent improvements in computational domain modeling result in a decrease, albeit slowly, in the relative size of “dark matter”; however, its absolute size increases substantially with the growth of sequence data. PMID:23157439

  16. Dynamics of domain coverage of the protein sequence universe

    Directory of Open Access Journals (Sweden)

    Rekapalli Bhanu

    2012-11-01

    Full Text Available Abstract Background The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its “dark matter”. Results Here we suggest that the true size of “dark matter” is much larger than stated by current definitions. We propose an approach to reducing the size of “dark matter” by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Conclusions Recent improvements in computational domain modeling result in a decrease, albeit slowly, in the relative size of “dark matter”; however, its absolute size increases substantially with the growth of sequence data.

  17. The acoustics of public squares/places: A comparison between results from a computer simulation program and measurements in situ

    DEFF Research Database (Denmark)

    Paini, Dario; Rindel, Jens Holger; Gade, Anders

    2004-01-01

    or a band during, for instance, music summer festivals) and the best position for the audience. A further result could be to propose some acoustic adjustments to achieve better acoustic quality by considering the acoustic parameters which are typically used for concert halls and opera houses.......In the context of a PhD thesis, in which the main purpose is to analyse the importance of the public square/place (“agora”) as a meeting point of sound and music, with particular regard to its use for concerts (amplified or not), a first step was taken, making comparisons between measurements in situ

  18. A simulation model for visitors’ thermal comfort at urban public squares using non-probabilistic binary-linear classifier through soft-computing methodologies

    International Nuclear Information System (INIS)

    Kariminia, Shahab; Shamshirband, Shahaboddin; Hashim, Roslan; Saberi, Ahmadreza; Petković, Dalibor; Roy, Chandrabhushan; Motamedi, Shervin

    2016-01-01

    Sustaining outdoor life in cities is decreasing because of the recent rapid urbanisation without considering climate-responsive urban design concepts. Such inadvertent climatic modifications at the indoor level have imposed considerable demand on the urban energy resources. It is important to provide a comfortable ambient climate at open urban squares, and researchers need to predict the comfortable conditions at such outdoor squares. The main objective of this study is to predict the visitors' outdoor comfort indices by using a developed computational model termed SVM-WAVELET (Support Vector Machines combined with the Discrete Wavelet Transform algorithm). For data collection, the field study was conducted in downtown Isfahan, Iran (51°41′ E, 32°37′ N), with hot and arid summers. Based on different environmental elements, four separate locations were monitored across two public squares. Meteorological data were measured simultaneously by surveying the visitors' thermal sensations. According to the subjects' thermal feeling and their characteristics, their level of comfort was estimated. Further, the adapted computational model was used to estimate the visitors’ thermal sensations in terms of thermal comfort indices. The SVM-WAVELET results indicate that the R² values for the input parameters, including Thermal Sensation, PMV (predicted mean vote), PET (physiologically equivalent temperature), SET (standard effective temperature) and Tmrt, were estimated at 0.482, 0.943, 0.988, 0.969 and 0.840, respectively. - Highlights: • To explore the visitors' thermal sensation at urban public squares. • This article introduces findings of outdoor comfort prediction. • The developed SVM-WAVELET soft-computing technique was used. • SVM-WAVELET estimation results are more reliable and accurate.
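    The wavelet stage of such a hybrid model decomposes each input series into approximation and detail coefficients, which are then fed to the SVM. A one-level Haar transform, used here purely as an illustrative wavelet (the paper does not specify the Haar basis), can be sketched as:

```python
from math import sqrt

def haar_dwt(signal):
    """One level of a Haar discrete wavelet transform: pairwise averages
    (approximation) and differences (detail), each scaled by 1/sqrt(2)."""
    approx = [(signal[i] + signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

approx, detail = haar_dwt([4.0, 4.0, 2.0, 0.0])
```

    Repeating the transform on the approximation coefficients yields a multi-level decomposition; the coefficients at the chosen levels become the feature vector for the SVM regressor.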

  19. Domain walls at finite temperature

    International Nuclear Information System (INIS)

    Carvalho, C.A. de; Marques, G.C.; Silva, A.J. da; Ventura, I.

    1983-08-01

    It is suggested that the phase transition of λφ⁴ theory as a function of temperature coincides with the spontaneous appearance of domain walls. Based on one-loop calculations, T_c = 4M/√λ is estimated as the temperature at which these domains become energetically favored, to be compared with T_c = 4.9M/√λ from effective potential calculations (which are performed directly in the broken phase). Domain walls, as well as other types of fluctuations, disorder the system above T_c, leading to ⟨φ⟩ = 0. The critical exponent for the specific heat above T_c is computed, and α = 2/3 + O(√λ) is obtained. (Author) [pt

  20. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

    Full Text Available In this paper, we investigate the implementation of image filtering in frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in spatial domain which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, the frequency domain filtering uses FFT (Fast Fourier Transform) which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from the CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian for processing different sizes of images on CPU and GPU respectively and perform the GPU vs. CPU benchmarks. The results presented in this paper show that the frequency domain filtering with CUDA achieves significant speed-up over the CPU processing in frequency domain with the same level of (output) image quality on both the processing architectures.
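    The pipeline described is: transform the image, multiply by the filter's frequency response, transform back. The 1-D sketch below uses a naive O(n²) DFT in place of the FFT, and a Gaussian response as one of the three filters mentioned; on the GPU each of these stages would be a CUDA kernel or a cuFFT call:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (stand-in for an FFT)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                for k in range(n)) for f in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[f] * cmath.exp(2j * cmath.pi * f * k / n)
                for f in range(n)).real / n for k in range(n)]

def gaussian_lowpass(x, sigma=1.0):
    """Frequency-domain Gaussian low-pass filter of a 1-D signal."""
    n = len(x)
    X = dft(x)
    # Gaussian response over the folded frequency index min(f, n - f).
    H = [cmath.exp(-min(f, n - f) ** 2 / (2 * sigma ** 2)) for f in range(n)]
    return idft([Xf * Hf for Xf, Hf in zip(X, H)])

flat = gaussian_lowpass([1.0] * 8)  # a constant signal passes unchanged
```

    Replacing the quadratic DFT with an FFT reduces the transform cost from O(n²) to O(n log n), which is exactly the advantage over large spatial convolutions that the paper exploits.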

  1. Frequency domain image filtering using cuda

    International Nuclear Information System (INIS)

    Rajput, M.A.; Khan, U.A.

    2014-01-01

    In this paper, we investigate the implementation of image filtering in frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in spatial domain which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, the frequency domain filtering uses FFT (Fast Fourier Transform) which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from the CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian for processing different sizes of images on CPU and GPU respectively and perform the GPU vs. CPU benchmarks. The results presented in this paper show that the frequency domain filtering with CUDA achieves significant speed-up over the CPU processing in frequency domain with the same level of (output) image quality on both the processing architectures. (author)

  2. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    Ştefan IOVAN

    2016-05-01

    Full Text Available Cloud computing represents software applications offered as a service online, as well as the software and hardware components in the data center. When services are offered openly to any type of client, we are dealing with a public cloud. In the other case, in which a cloud is exclusively available to one organization and is not open to the public, it is considered a private cloud [1]. There is also a third type, called hybrid, in which a user or an organization might use services available in both the public and the private cloud. Among the main challenges of cloud computing are building trust and offering information privacy in every aspect of the services cloud computing provides. The variety of existing standards, just like the lack of clarity in sustainability certification, is not a real help in building trust. Question marks also appear regarding the efficiency of traditional security means when applied in the cloud domain. Besides the economic and technological advantages offered by the cloud, there are also some advantages in the security area if the information is migrated to the cloud. Shared resources available in the cloud include surveying, use of "best practices" and technology for an advanced security level, above all the solutions offered to the majority of medium and small businesses, big companies and even some governmental organizations [2].

  3. SPEEDI: a computer code system for the real-time prediction of radiation dose to the public due to an accidental release

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi; Ishikawa, Hirohiko

    1985-10-01

    SPEEDI, a computer code system for predicting environmental doses from radioactive materials accidentally released from a nuclear plant, has been developed to assist the organizations responsible for emergency planning. For realistic simulation, two models have been developed: one statistically predicts the basic wind data and then calculates the three-dimensional mass-consistent wind field by interpolating these predicted data; the other calculates the diffusion of released materials using a combined model of random-walk and PICK methods. These calculations are carried out in conversational mode with a computer so that the system can be used with ease in an emergency. SPEEDI also has versatile files, which make it easy to control the complicated flows of calculation. In order to attain short computation times, a large-scale computer with a performance of 25 MIPS and a vector processor of up to 250 MFLOPS are used for the model calculations, so that quick responses are obtained. Simplified models are also prepared for calculation on a minicomputer of the kind widely used by local governments and research institutes, although the same precision of calculation as with the above models cannot be expected. The present report outlines the structure and functions of SPEEDI, the methods for predicting the wind field, the models for calculating the concentration of released materials in air and on the ground, and the doses to the public. Some of the diffusion models have been compared with field experiments carried out as part of the SPEEDI development program. The report also discusses the reliability of the diffusion models on the basis of the compared results, and shows that they can reasonably simulate diffusion in the internal boundary layer which commonly occurs near coastal regions. (J.P.N.)
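    The random-walk part of such a dispersion model can be sketched as a Lagrangian particle simulation: each particle is advected by the wind and jittered by a turbulent diffusion term. The uniform wind and the parameter values below are illustrative; SPEEDI itself uses a three-dimensional mass-consistent wind field:

```python
import random

random.seed(0)  # reproducible demo run

def advect_diffuse(n_particles, steps, wind=(1.0, 0.5), sigma=0.3):
    """Advect particles released at the origin through `steps` time steps,
    adding Gaussian random-walk displacements for turbulent diffusion."""
    particles = [(0.0, 0.0)] * n_particles
    for _ in range(steps):
        particles = [(x + wind[0] + random.gauss(0.0, sigma),
                      y + wind[1] + random.gauss(0.0, sigma))
                     for x, y in particles]
    return particles

cloud = advect_diffuse(200, 10)
mean_x = sum(p[0] for p in cloud) / len(cloud)
mean_y = sum(p[1] for p in cloud) / len(cloud)
```

    Ground-level concentration, and hence dose, is then estimated by counting particles per grid cell: the plume centroid drifts downwind at the wind speed while its spread grows with the diffusion term.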

  4. [Analysis and evaluation of the visual effort in remote-control public traffic operators working with computer-based equipments].

    Science.gov (United States)

    Gullà, F; Zambelli, P; Bergamaschi, A; Piccoli, B

    2007-01-01

    The aim of this study is the objective evaluation, by means of electronic equipment, of the visual effort in 6 public traffic controllers (4 male, 2 female, mean age 29.6). The electronic equipment quantifies the observation distance and the observation time for each controller's occupational visual field. The quantification of these parameters is obtained by the emission of ultrasound at 40 kHz from an emission sensor (placed by the VDT screen) and its reception by a receiving sensor (placed on the operator's head). The travelling time of the ultrasound (US), since the speed of sound in air is known and constant (about 340 m/s), is used to calculate the distance between the emitting and the receiving sensor. The results show that the visual acuity required is of average level, while the accommodation and convergence effort varies from average to intense (depending on the visual characteristics of the operator considered), ranging from 26.41% to 43.92% of the accommodation and convergence available to each operator. The time actually spent in "near observation within the c.v.p." (Tscr) ranged from 2 h 54 min to 4 h 05 min.

  5. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, the general-purpose input routine, and the efficient use of memory are also elaborated. This publication is inten

  6. Finite difference time domain analysis of a chiro plasma

    International Nuclear Information System (INIS)

    Torres-Silva, H.; Obligado, A.; Reggiani, N.; Sakanaka, P.H.

    1995-01-01

    The finite difference time-domain (FDTD) method is one of the most widely used computational methods in electromagnetics. Using FDTD, Maxwell's equations are solved directly in the time domain via finite differences and time stepping. The basic approach is relatively easy to understand and is an alternative to the more usual frequency-domain approaches. (author). 5 refs
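    A minimal 1-D vacuum version of the method (Yee grid, normalized units with Courant number 0.5) shows the structure of the leapfrog time stepping; the chiro-plasma constitutive relations treated in the paper are omitted:

```python
from math import exp

N, STEPS = 200, 150
ez = [0.0] * N  # electric field samples
hy = [0.0] * N  # magnetic field samples, staggered half a cell

for t in range(STEPS):
    # update H from the spatial difference (curl) of E
    for i in range(N - 1):
        hy[i] += 0.5 * (ez[i + 1] - ez[i])
    # update E from the spatial difference (curl) of H
    for i in range(1, N):
        ez[i] += 0.5 * (hy[i] - hy[i - 1])
    # soft Gaussian source injected at one quarter of the grid
    ez[N // 4] += exp(-((t - 30) ** 2) / 100.0)

peak = max(abs(e) for e in ez)
```

    Material response (here, the chiral plasma) enters through the field update coefficients; the overall pattern of alternating E and H updates per time step is unchanged.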

  7. A systemic domain model for ambient pervasive persuasive games

    OpenAIRE

    Eglin, Roger; Eyles, Mark; Dansey, Neil

    2008-01-01

    Through the development of the systemic domain model it is hoped that greater conceptual and theoretical clarity may be brought to understanding the complex and multifaceted nature of pervasive and ambient computer games. This paper presents a conceptual model, the systemic domain model, to illustrate domain areas that exist in a console, pervasive, or ambient game. It is implicit that the regions the systemic domain model describes are contextually dependent. By developing this model it is poss...

  8. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  9. Using context to improve protein domain identification

    Directory of Open Access Journals (Sweden)

    Llinás Manuel

    2011-03-01

    Full Text Available Abstract Background Identifying domains in protein sequences is an important step in protein structural and functional annotation. Existing domain recognition methods typically evaluate each domain prediction independently of the rest. However, the majority of proteins are multidomain, and pairwise domain co-occurrences are highly specific and non-transitive. Results Here, we demonstrate how to exploit domain co-occurrence to boost weak domain predictions that appear in previously observed combinations, while penalizing higher confidence domains if such combinations have never been observed. Our framework, Domain Prediction Using Context (dPUC, incorporates pairwise "context" scores between domains, along with traditional domain scores and thresholds, and improves domain prediction across a variety of organisms from bacteria to protozoa and metazoa. Among the genomes we tested, dPUC is most successful at improving predictions for the poorly-annotated malaria parasite Plasmodium falciparum, for which over 38% of the genome is currently unannotated. Our approach enables high-confidence annotations in this organism and the identification of orthologs to many core machinery proteins conserved in all eukaryotes, including those involved in ribosomal assembly and other RNA processing events, which surprisingly had not been previously known. Conclusions Overall, our results demonstrate that this new context-based approach will provide significant improvements in domain and function prediction, especially for poorly understood genomes for which the need for additional annotations is greatest. Source code for the algorithm is available under a GPL open source license at http://compbio.cs.princeton.edu/dpuc/. Pre-computed results for our test organisms and a web server are also available at that location.
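    The scoring idea can be sketched as follows: each candidate domain carries a standalone score, observed domain pairs add a context bonus, and the combined score is compared against a threshold, so a weak prediction in a familiar combination can pass while the same domain alone fails. All scores, identifiers, and the threshold below are invented for illustration:

```python
# Toy context bonuses for previously observed domain pairs.
CONTEXT = {("PF00069", "PF07714"): 4.0}
THRESHOLD = 10.0

def total_score(candidates):
    """Sum standalone scores plus context bonuses for co-occurring pairs.

    candidates: list of (domain_id, standalone_score) tuples.
    """
    score = sum(s for _, s in candidates)
    ids = [d for d, _ in candidates]
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            score += CONTEXT.get((a, b), CONTEXT.get((b, a), 0.0))
    return score

pair_score = total_score([("PF00069", 4.0), ("PF07714", 3.0)])
solo_score = total_score([("PF00069", 4.0)])
```

    Here the pair clears the threshold although neither domain would alone; dPUC additionally penalizes combinations never observed, which this sketch omits.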

  10. Coping with Computer Viruses: General Discussion and Review of Symantec Anti-Virus for the Macintosh.

    Science.gov (United States)

    Primich, Tracy

    1992-01-01

    Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM, as well as two public domain virus protection programs, is recommended for use in library settings. (four references) (MES)

  11. Finding the Secret of Image Saliency in the Frequency Domain.

    Science.gov (United States)

    Li, Jia; Duan, Ling-Yu; Chen, Xiaowu; Huang, Tiejun; Tian, Yonghong

    2015-12-01

    There are two sides to every story of visual saliency modeling in the frequency domain. On the one hand, image saliency can be effectively estimated by applying simple operations to the frequency spectrum. On the other hand, it is still unclear which part of the frequency spectrum contributes the most to popping-out targets and suppressing distractors. Toward this end, this paper tentatively explores the secret of image saliency in the frequency domain. From the results obtained in several qualitative and quantitative experiments, we find that the secret of visual saliency may mainly hide in the phases of intermediate frequencies. To explain this finding, we reinterpret the concept of discrete Fourier transform from the perspective of template-based contrast computation and thus develop several principles for designing the saliency detector in the frequency domain. Following these principles, we propose a novel approach to design the saliency detector under the assistance of prior knowledge obtained through both unsupervised and supervised learning processes. Experimental results on a public image benchmark show that the learned saliency detector outperforms 18 state-of-the-art approaches in predicting human fixations.

  12. Casimir forces in the time domain: Theory

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro W.; McCauley, Alexander P.; Joannopoulos, John D.; Johnson, Steven G.

    2009-01-01

    We present a method to compute Casimir forces in arbitrary geometries and for arbitrary materials based on the finite-difference time-domain (FDTD) scheme. The method involves the time evolution of electric and magnetic fields in response to a set of current sources, in a modified medium with frequency-independent conductivity. The advantage of this approach is that it allows one to exploit existing FDTD software, without modification, to compute Casimir forces. In this paper, we focus on the derivation, implementation choices, and essential properties of the time-domain algorithm, both considered analytically and illustrated in the simplest parallel-plate geometry.

  13. Domain fusion analysis by applying relational algebra to protein sequence and domain databases.

    Science.gov (United States)

    Truong, Kevin; Ikura, Mitsuhiko

    2003-05-06

    Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
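As the abstract notes, domain fusion analysis reduces to joins expressible in standard SQL. A toy sketch with an in-memory SQLite database follows; the table layout and example data are invented for illustration (the paper's actual analysis ran over SWISS-PROT+TrEMBL with Pfam assignments).

```python
# Domain fusion via relational joins: two domains fused in one protein in
# organism X predict a functional link between the separate proteins that
# carry those domains in organism Y.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE assign (protein TEXT, domain TEXT, organism TEXT)")
con.executemany("INSERT INTO assign VALUES (?, ?, ?)", [
    ("fusionAB", "A", "yeast"),   # domains A and B fused in one yeast protein
    ("fusionAB", "B", "yeast"),
    ("humP1",    "A", "human"),   # ...but carried by separate human proteins
    ("humP2",    "B", "human"),
])

# f1/f2: domain pairs fused within a single protein ("Rosetta stone").
# h1/h2: protein pairs in another organism carrying the domains separately.
query = """
SELECT DISTINCT h1.protein, h2.protein
FROM assign f1
JOIN assign f2 ON f1.protein = f2.protein AND f1.domain < f2.domain
JOIN assign h1 ON h1.domain = f1.domain AND h1.organism != f1.organism
JOIN assign h2 ON h2.domain = f2.domain AND h2.organism = h1.organism
               AND h1.protein != h2.protein
"""
print(con.execute(query).fetchall())  # [('humP1', 'humP2')]
```

Because the whole analysis is a handful of self-joins, it can be re-run automatically whenever the underlying sequence and domain tables are refreshed, which is the dynamic-update property the paper emphasizes.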

  14. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  15. Supersymmetric domain walls

    NARCIS (Netherlands)

    Bergshoeff, Eric A.; Kleinschmidt, Axel; Riccioni, Fabio

    2012-01-01

    We classify the half-supersymmetric "domain walls," i.e., branes of codimension one, in toroidally compactified IIA/IIB string theory and show to which gauged supergravity theory each of these domain walls belong. We use as input the requirement of supersymmetric Wess-Zumino terms, the properties of

  16. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  17. Alternative to domain wall fermions

    International Nuclear Information System (INIS)

    Neuberger, H.

    2002-01-01

    An alternative to commonly used domain wall fermions is presented. Some rigorous bounds on the condition number of the associated linear problem are derived. On the basis of these bounds and some experimentation it is argued that domain wall fermions will in general be associated with a condition number that is of the same order of magnitude as the product of the condition number of the linear problem in the physical dimensions by the inverse bare quark mass. Thus, the computational cost of implementing true domain wall fermions using a single conjugate gradient algorithm is of the same order of magnitude as that of implementing the overlap Dirac operator directly using two nested conjugate gradient algorithms. At a cost of about a factor of two in operation count it is possible to make the memory usage of direct implementations of the overlap Dirac operator independent of the accuracy of the approximation to the sign function and of the same order as that of standard Wilson fermions

  18. Knowledge-based public health situation awareness

    Science.gov (United States)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential threats of bioterrorism before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or can reliably distinguish man-made from natural incidents. One must evaluate the value of so-called 'syndromic surveillance systems' along with the costs involved in the design, development, implementation and maintenance of such systems and the costs involved in investigation of the inevitable false alarms [1]. In this article we will introduce a new perspective to the problem domain with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we will introduce a different methodology for applying information science, computer science, cognitive science and human-computer interaction concepts in the design and development of so-called 'public health situation awareness systems'. We will share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research, in the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction, from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  19. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  20. Memetic Algorithms, Domain Knowledge, and Financial Investing

    Science.gov (United States)

    Du, Jie

    2012-01-01

    While the question of how to use human knowledge to guide evolutionary search is long-recognized, much remains to be done to answer this question adequately. This dissertation aims to further answer this question by exploring the role of domain knowledge in evolutionary computation as applied to real-world, complex problems, such as financial…

  1. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  2. Publicity and public relations

    Science.gov (United States)

    Fosha, Charles E.

    1990-01-01

    This paper addresses approaches to using publicity and public relations to meet the goals of the NASA Space Grant College. Methods universities and colleges can use to publicize space activities are presented.

  3. Conserved Domain Database (CDD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — CDD is a protein annotation resource that consists of a collection of well-annotated multiple sequence alignment models for ancient domains and full-length proteins.

  4. Towards Domain-specific Flow-based Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert; Sarjoughian, Hessam S.

    2018-01-01

    describe their problems and solutions, instead of using general purpose programming languages. The goal of these languages is to improve the productivity and efficiency of the development and simulation of concurrent scientific models and systems. Moreover, they help to expose parallelism and to specify...... the concurrency within a component or across different independent components. In this paper, we introduce the concept of domain-specific flowbased languages which allows domain experts to use flow-based languages adapted to a particular problem domain. Flow-based programming is used to support concurrency, while......Due to the significant growth of the demand for data-intensive computing, in addition to the emergence of new parallel and distributed computing technologies, scientists and domain experts are leveraging languages specialized for their problem domain, i.e., domain-specific languages, to help them...

  5. Análisis de dominio de la revista mexicana Investigación Bibliotecológica = Domain Analysis by the Mexican Publication Investigación Bibliotecológica

    Directory of Open Access Journals (Sweden)

    Félix de Moya-Anegón

    2001-12-01

    Full Text Available El análisis de dominio está compuesto por un conjunto de metodologías que permiten delinear la estructura de relaciones existente en una determinada disciplina. El objetivo del presente trabajo es brindar un análisis de dominio de la disciplina Bibliotecología y Documentación (ByD) en México. Para ello se analizará la bibliografía citada por la revista Investigación Bibliotecológica (IB). Entre los elementos a analizar se encuentran: producción, autoría, coautoría, fuentes citadas y cocitación de revistas = Domain analysis involves a set of methods that show the structure of relations in a specific discipline. The aim of this paper is to provide a domain analysis of Library and Information Science (LIS) in México. For that purpose, the bibliography cited by the Investigación Bibliotecológica (IB) journal is analyzed. We analyze production, authorship, co-authorship, cited sources, and journal co-citation.

  7. A thermodynamic definition of protein domains.

    Science.gov (United States)

    Porter, Lauren L; Rose, George D

    2012-06-12

    Protein domains are conspicuous structural units in globular proteins, and their identification has been a topic of intense biochemical interest dating back to the earliest crystal structures. Numerous disparate domain identification algorithms have been proposed, all involving some combination of visual intuition and/or structure-based decomposition. Instead, we present a rigorous, thermodynamically-based approach that redefines domains as cooperative chain segments. In greater detail, most small proteins fold with high cooperativity, meaning that the equilibrium population is dominated by completely folded and completely unfolded molecules, with a negligible subpopulation of partially folded intermediates. Here, we redefine structural domains in thermodynamic terms as cooperative folding units, based on m-values, which measure the cooperativity of a protein or its substructures. In our analysis, a domain is equated to a contiguous segment of the folded protein whose m-value is largely unaffected when that segment is excised from its parent structure. Defined in this way, a domain is a self-contained cooperative unit; i.e., its cooperativity depends primarily upon intrasegment interactions, not intersegment interactions. Implementing this concept computationally, the domains in a large representative set of proteins were identified; all exhibit consistency with experimental findings. Specifically, our domain divisions correspond to the experimentally determined equilibrium folding intermediates in a set of nine proteins. The approach was also proofed against a representative set of 71 additional proteins, again with confirmatory results. Our reframed interpretation of a protein domain transforms an indeterminate structural phenomenon into a quantifiable molecular property grounded in solution thermodynamics.

  8. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  9. Experience with an on-line computer for controlling and optimizing the gas supply and the application of various peak-load supply plants of a public utility

    Energy Technology Data Exchange (ETDEWEB)

    Poll, J [Technische Werke der Stadt Stuttgart A.G. (Germany, F.R.)

    1977-02-01

    The computer system has the following tasks: 1) on-line control; 2) supply of an information system; 3) preparation of a gas sales forecast; 4) background computations. Measured data are compiled, processed, monitored, recorded, prepared, and stored. The process is controlled by about a dozen programmes; the remaining tasks are handled by 22 programmes. The system has proved successful.

  10. Construction and Design Kits: Human Problem-Domain Communication

    National Research Council Canada - National Science Library

    Fischer, Gerhard; Lemke, Andreas C

    1987-01-01

    .... To provide the user with the appropriate level of control and a better understanding, we have to replace human-computer communication with human problem-domain communication, which allows users...

  11. A time domain phase-gradient based ISAR autofocus algorithm

    CSIR Research Space (South Africa)

    Nel, W

    2011-10-01

    Full Text Available . Results on simulated and measured data show that the algorithm performs well. Unlike many other ISAR autofocus techniques, the algorithm does not make use of several computationally intensive iterations between the data and image domains as part...

  12. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  13. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  14. Knowing, Applying, and Reasoning about Arithmetic: Roles of Domain-General and Numerical Skills in Multiple Domains of Arithmetic Learning

    Science.gov (United States)

    Zhang, Xiao; Räsänen, Pekka; Koponen, Tuire; Aunola, Kaisa; Lerkkanen, Marja-Kristiina; Nurmi, Jari-Erik

    2017-01-01

    The longitudinal relations of domain-general and numerical skills at ages 6-7 years to 3 cognitive domains of arithmetic learning, namely knowing (written computation), applying (arithmetic word problems), and reasoning (arithmetic reasoning) at age 11, were examined for a representative sample of 378 Finnish children. The results showed that…

  15. Bregmanized Domain Decomposition for Image Restoration

    KAUST Repository

    Langer, Andreas

    2012-05-22

    Computational problems of large-scale data are gaining attention recently due to better hardware and hence, higher dimensionality of images and data sets acquired in applications. In the last couple of years non-smooth minimization problems such as total variation minimization became increasingly important for the solution of these tasks. While being favorable due to the improved enhancement of images compared to smooth imaging approaches, non-smooth minimization problems typically scale badly with the dimension of the data. Hence, for large imaging problems solved by total variation minimization domain decomposition algorithms have been proposed, aiming to split one large problem into N > 1 smaller problems which can be solved on parallel CPUs. The N subproblems constitute constrained minimization problems, where the constraint enforces the support of the minimizer to be the respective subdomain. In this paper we discuss a fast computational algorithm to solve domain decomposition for total variation minimization. In particular, we accelerate the computation of the subproblems by nested Bregman iterations. We propose a Bregmanized Operator Splitting-Split Bregman (BOS-SB) algorithm, which enforces the restriction onto the respective subdomain by a Bregman iteration that is subsequently solved by a Split Bregman strategy. The computational performance of this new approach is discussed for its application to image inpainting and image deblurring. It turns out that the proposed new solution technique is up to three times faster than the iterative algorithm currently used in domain decomposition methods for total variation minimization. © Springer Science+Business Media, LLC 2012.

  16. A scoping review of cloud computing in healthcare.

    Science.gov (United States)

    Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin

    2015-03-19

    Cloud computing is a recent and fast growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allows for new ways of developing, delivering and using services. Cloud computing is often used in an "OMICS-context", e.g. for computing in genomics, proteomics and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers who consolidated their findings. 102 publications have been analyzed and 6 main topics have been found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-for-use characteristics of cloud-based services avoiding upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing and the characteristics of the particular cloud computing approach are unfortunately not really illustrated. Even though cloud computing in healthcare is of growing interest, only a few successful implementations exist so far, and many papers just use the term "cloud" synonymously for "using virtual machines" or "web

  17. Domain: Labour market

    NARCIS (Netherlands)

    Oude Mulders, J.; Wadensjö, E.; Hasselhorn, H.M.; Apt, W.

    This domain chapter is dedicated to summarizing research on the effects of labour market contextual factors on the labour market participation of older workers (aged 50+) and to identifying research gaps. While employment participation and the timing of (early) retirement is often modelled as an individual

  18. Cellulose binding domain proteins

    Science.gov (United States)

    Shoseyov, Oded; Shpiegl, Itai; Goldstein, Marc; Doi, Roy

    1998-01-01

    A cellulose binding domain (CBD) having a high affinity for crystalline cellulose and chitin is disclosed, along with methods for the molecular cloning and recombinant production thereof. Fusion products comprising the CBD and a second protein are likewise described. A wide range of applications are contemplated for both the CBD and the fusion products, including drug delivery, affinity separations, and diagnostic techniques.

  19. Domain-Specific Multimodeling

    DEFF Research Database (Denmark)

    Hessellund, Anders

    the overall level of abstraction. It does, however, also introduce a new problem of coordinating multiple different languages in a single system. We call this problem the coordination problem. In this thesis, we present the coordination method for domain-specific multimodeling that explicitly targets...

  20. GlycoDomainViewer

    DEFF Research Database (Denmark)

    Joshi, Hiren J; Jørgensen, Anja; Schjoldager, Katrine T

    2018-01-01

    features, which enhances visibility and accessibility of the wealth of glycoproteomic data being generated. The GlycoDomainViewer enables visual exploration of glycoproteomic data, incorporating information from recent N- and O-glycoproteome studies on human and animal cell lines and some organs and body...

  1. Assessing the economic impact of public investment in Malaysia: a case study on MyRapid Transit project using a dynamic computable general equilibrium model

    OpenAIRE

    Muniandy, Meenachi

    2017-01-01

    The central focus of this thesis is the question of whether public investment in transport infrastructure contributes positively to Malaysia’s economic growth and welfare. Although there are strong analytical reasons to believe that public investment spending is one of the important variables that influence growth, there remains significant uncertainty about its actual degree of influence. In Malaysia, whenever there is a collapse in domestic demand, government spending becomes an important m...

  2. The framing of scientific domains

    DEFF Research Database (Denmark)

    Dam Christensen, Hans

    2014-01-01

    domains, and UNISIST helps understanding this navigation. Design/methodology/approach The UNISIST models are tentatively applied to the domain of art history at three stages, respectively two modern, partially overlapping domains, as well as an outline of an art historical domain anno c1820...

  3. Multilevel domain decomposition for electronic structure calculations

    International Nuclear Information System (INIS)

    Barrault, M.; Cances, E.; Hager, W.W.; Le Bris, C.

    2007-01-01

    We introduce a new multilevel domain decomposition method (MDD) for electronic structure calculations within semi-empirical and density functional theory (DFT) frameworks. This method iterates between local fine solvers and global coarse solvers, in the spirit of domain decomposition methods. Using this approach, calculations have been successfully performed on several linear polymer chains containing up to 40,000 atoms and 200,000 atomic orbitals. Both the computational cost and the memory requirement scale linearly with the number of atoms. Additional speed-up can easily be obtained by parallelization. We show that this domain decomposition method outperforms the density matrix minimization (DMM) method for poor initial guesses. Our method provides an efficient preconditioner for DMM and other linear scaling methods, variational in nature, such as the orbital minimization (OM) procedure
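The iterate-between-local-and-global idea behind domain decomposition can be illustrated on a 1D toy Poisson problem with the classical alternating Schwarz method. This is a generic sketch of the technique, unrelated to the electronic-structure MDD code itself; the grid, subdomain split, and iteration count are chosen arbitrarily.

```python
# Alternating Schwarz on -u'' = 1 over (0, 1) with zero Dirichlet BCs:
# repeatedly solve on two overlapping subdomains, each using the current
# global iterate as boundary data at its interior interface.
import numpy as np

N, h = 99, 1.0 / 100                 # interior grid points and mesh width
x = np.linspace(0.0, 1.0, N + 2)
f = np.ones(N + 2)
u = np.zeros(N + 2)                  # global iterate; u[0] = u[N+1] = 0

def subdomain_solve(u, lo, hi):
    """Solve -u'' = f on grid points lo..hi with current u as boundary data."""
    m = hi - lo + 1
    A = (np.diag(2.0 * np.ones(m)) - np.diag(np.ones(m - 1), 1)
         - np.diag(np.ones(m - 1), -1)) / h**2
    rhs = f[lo:hi + 1].copy()
    rhs[0]  += u[lo - 1] / h**2      # left interface value from global iterate
    rhs[-1] += u[hi + 1] / h**2      # right interface value from global iterate
    u[lo:hi + 1] = np.linalg.solve(A, rhs)

for _ in range(30):                  # alternate between the two subdomains
    subdomain_solve(u, 1, 60)        # left subdomain:  x in (0, 0.61)
    subdomain_solve(u, 40, N)        # right subdomain: x in (0.39, 1)

exact = 0.5 * x * (1.0 - x)          # analytic solution of -u'' = 1
print(np.max(np.abs(u - exact)))     # small: the overlapping sweeps converge
```

The overlap between the two subdomains is what drives convergence; with more subdomains and a coarse global correction, the same pattern scales to the linear-cost behavior reported in the abstract.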

  4. Conduction at domain walls in oxide multiferroics

    Science.gov (United States)

    Seidel, J.; Martin, L. W.; He, Q.; Zhan, Q.; Chu, Y.-H.; Rother, A.; Hawkridge, M. E.; Maksymovych, P.; Yu, P.; Gajek, M.; Balke, N.; Kalinin, S. V.; Gemming, S.; Wang, F.; Catalan, G.; Scott, J. F.; Spaldin, N. A.; Orenstein, J.; Ramesh, R.

    2009-03-01

    Domain walls may play an important role in future electronic devices, given their small size as well as the fact that their location can be controlled. Here, we report the observation of room-temperature electronic conductivity at ferroelectric domain walls in the insulating multiferroic BiFeO3. The origin and nature of the observed conductivity are probed using a combination of conductive atomic force microscopy, high-resolution transmission electron microscopy and first-principles density functional computations. Our analyses indicate that the conductivity correlates with structurally driven changes in both the electrostatic potential and the local electronic structure, which shows a decrease in the bandgap at the domain wall. Additionally, we demonstrate the potential for device applications of such conducting nanoscale features.

  5. Predicting detection performance with model observers: Fourier domain or spatial domain?

    Science.gov (United States)

    Chen, Baiyu; Yu, Lifeng; Leng, Shuai; Kofler, James; Favazza, Christopher; Vrieze, Thomas; McCollough, Cynthia

    2016-02-27

    The use of Fourier domain model observers is challenged by iterative reconstruction (IR), because IR algorithms are nonlinear and IR images have a noise texture different from that of filtered backprojection (FBP). A modified Fourier domain model observer, which incorporates nonlinear noise and resolution properties, has been proposed for IR and needs to be validated against human detection performance. The spatial domain model observer, on the other hand, is theoretically applicable to IR but more computationally intensive than the Fourier domain method. The purpose of this study is to compare the modified Fourier domain model observer to the spatial domain model observer on both FBP and IR images, using human detection performance as the gold standard. A phantom with inserts of various low contrast levels and sizes was repeatedly scanned 100 times on a third-generation, dual-source CT scanner at 5 dose levels and reconstructed using FBP and IR algorithms. The human detection performance on the inserts was measured via a 2-alternative forced choice (2AFC) test. In addition, two model observer performances were calculated: a Fourier domain non-prewhitening model observer and a spatial domain channelized Hotelling observer. The performance of these two model observers was compared in terms of how well they correlated with human observer performance. Our results demonstrated that the spatial domain model observer correlated well with human observers across various dose levels, object contrast levels, and object sizes. The Fourier domain observer correlated well with human observers using FBP images, but overestimated the detection performance using IR images.
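As a rough illustration of the spatial domain approach, the sketch below computes a channelized Hotelling observer detectability index on simulated low-contrast patches. The disk signal, the Gaussian-profile channels, and all sizes are illustrative assumptions, not the study's phantom or channel set (which would use, e.g., Gabor or Laguerre-Gauss channels).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy low-contrast detection task: noisy 16x16 patches with/without a disk signal.
n, side = 400, 16
yy, xx = np.mgrid[:side, :side]
disk = ((yy - side / 2) ** 2 + (xx - side / 2) ** 2 < 9).astype(float) * 0.4
g_absent = rng.normal(size=(n, side * side))
g_present = rng.normal(size=(n, side * side)) + disk.ravel()

# Hypothetical channel set: radially symmetric Gaussian channels of several widths.
radii = np.hypot(yy - side / 2, xx - side / 2).ravel()
U = np.stack([np.exp(-0.5 * (radii / s) ** 2) for s in (1.0, 2.0, 4.0, 8.0)], axis=1)

va, vp = g_absent @ U, g_present @ U             # channel outputs per class
S = 0.5 * (np.cov(va.T) + np.cov(vp.T))          # pooled channel covariance
delta = vp.mean(0) - va.mean(0)                  # mean channel-signal difference
w = np.linalg.solve(S, delta)                    # Hotelling template
dprime = (w @ delta) / np.sqrt(w @ S @ w)        # detectability index
print(float(dprime))                             # positive for a detectable signal
```

Since the template is the Hotelling solution, `dprime` reduces to sqrt(delta' S^-1 delta) and is always positive for a nonzero signal; in practice it would be compared against the human 2AFC percent correct.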

  6. Direct time-domain techniques for transient radiation and scattering

    International Nuclear Information System (INIS)

    Miller, E.K.; Landt, J.A.

    1976-01-01

    A tutorial introduction to transient electromagnetics, focusing on direct time-domain techniques, is presented. Physical, mathematical, numerical, and experimental aspects of time-domain methods are examined, with emphasis on wire objects excited as antennas or scatterers. Numerous computed examples illustrate the characteristics of direct time-domain procedures, especially where they may offer advantages over procedures in the more familiar frequency domain. These advantages include greater solution efficiency for many types of problems, the ability to handle nonlinearities, improved physical insight and interpretability, availability of wide-band information from a single calculation, and the possibility of isolating interactions among various parts of an object using time-range gating.

  7. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power, increasing the computing capacity of the gateway at little cost as well as its visibility to the general public.

  8. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as financial efficiency, cloud computing is an ecologically friendly technology; it enables efficient data sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
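The probe screen described above amounts to locating runs of guanine in probe sequences. A minimal sketch; the `min_len=4` threshold (four Gs per strand being the classic quadruplex requirement) and the example probe are illustrative assumptions, not the paper's exact criterion:

```python
import re

def g_runs(seq, min_len=4):
    """Return (start, run) for each run of >= min_len consecutive guanines."""
    return [(m.start(), m.group()) for m in re.finditer(f"G{{{min_len},}}", seq.upper())]

# A hypothetical 25-mer probe with one internal G-run:
print(g_runs("ACGTGGGGTACGTACGTGGTACGTA"))  # → [(4, 'GGGG')]
```

Probes flagged by such a scan would then be examined for unreliable expression measurements.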

  9. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  10. Wavefield Extrapolation in Pseudo-depth Domain

    KAUST Repository

    Ma, Xuxin

    2011-12-11

    Wave-equation based seismic migration and inversion tools are widely used by the energy industry to explore for hydrocarbon and mineral resources. By design, most of these techniques simulate wave propagation in a space domain whose vertical axis is depth measured from the surface. Vertical depth is popular because it is a straightforward mapping of the subsurface space. It is, however, not computationally cost-effective because the wavelength changes with local elastic wave velocity, which in general increases with depth in the Earth. As a result, the sampling per wavelength also increases with depth. To avoid spatial aliasing in deep fast media, the seismic wave is oversampled in shallow slow media, which increases the total computation cost. This issue is effectively tackled by using a vertical time axis instead of vertical depth, because in a vertical time representation the "wavelength" is essentially the time period for vertical rays. This thesis extends the vertical time axis to a pseudo-depth axis, which has units of distance while preserving the properties of the vertical time representation. To explore the potential of wave-equation based imaging in the pseudo-depth domain, a partial differential equation (PDE) is derived to describe acoustic waves in this new domain. The new PDE is inherently anisotropic because of the use of a constant vertical velocity to convert between depth and vertical time. Such anisotropy results in lower reflection coefficients compared with conventional space domain modeling results. This feature helps suppress the low-wavenumber artifacts in reverse-time migration images that are caused by the widely used cross-correlation imaging condition. This thesis illustrates modeling acoustic waves in both the conventional space domain and the pseudo-depth domain. The numerical tool used to model acoustic waves is built on the lowrank approximation of Fourier integral operators. To investigate the potential
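The depth-to-vertical-time mapping underlying the pseudo-depth idea can be sketched as follows: tau(z) is the integral of slowness 1/v from the surface down to depth z. The grid spacing and the constant-velocity example are illustrative assumptions.

```python
import numpy as np

def depth_to_vertical_time(z, v):
    """Vertical travel time tau(z) = integral_0^z dz'/v(z'), trapezoidal rule.

    z: 1-D array of depths; v: velocity sampled at the same depths.
    """
    slowness = 1.0 / np.asarray(v, dtype=float)
    steps = 0.5 * (slowness[1:] + slowness[:-1]) * np.diff(z)
    return np.concatenate(([0.0], np.cumsum(steps)))

z = np.linspace(0.0, 2000.0, 201)                      # depth in metres
tau = depth_to_vertical_time(z, np.full_like(z, 2000.0))
print(float(tau[-1]))                                  # constant v: tau = z/v → 1.0 s
```

With a velocity that increases with depth, equal steps in tau correspond to ever larger steps in z, which is exactly why the tau (or pseudo-depth) axis avoids the oversampling of shallow slow media.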

  11. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  12. Innovative User Interfaces in the Industrial Domain

    OpenAIRE

    Jutterström, Jenny

    2010-01-01

    The goal of this thesis is to explore how the HMI of a process control system can be improved by applying modern interaction technologies. Many new interaction possibilities are arising on the market, while interaction in the industrial domain is still quite conservative, with computer mouse and keyboard as the central method of interaction. It is believed that by making use of technology available today, the user interface can provide further assistance to the process control operators a...

  13. DATA TRANSFER FROM A DEC PDP-11 BASED MASS-SPECTROMETRY DATA STATION TO AN MS-DOS PERSONAL-COMPUTER

    NARCIS (Netherlands)

    RAFFAELLI, A; BRUINS, AP

    This paper describes a simple procedure for obtaining better quality graphic output for mass spectrometry data from data systems equipped with poor quality printing devices. The procedure uses Kermit, a low-cost public domain program, to transfer ASCII tables to an MS-DOS personal computer where

  14. TENCompetence Domain Model

    NARCIS (Netherlands)

    2006-01-01

    This is the version 1.1 of the TENCompetence Domain Model (version 1.0 released at 19-6-2006; version 1.1 at 9-11-2008). It contains several files: a) a pdf with the model description, b) three jpg files with class models (also in the pdf), c) a MagicDraw zip file with the model itself, d) a release

  15. SH2 Domain Histochemistry.

    Science.gov (United States)

    Buhs, Sophia; Nollau, Peter

    2017-01-01

    Among posttranslational modifications, the phosphorylation of tyrosine residues is a key modification in cell signaling. Because of its biological importance, characterization of the cellular state of tyrosine phosphorylation is of great interest. Based on the unique properties of endogenously expressed SH2 domains, which recognize tyrosine-phosphorylated signaling proteins with high specificity, we have developed an alternative approach, coined SH2 profiling, enabling us to decipher complex patterns of tyrosine phosphorylation in various normal and cancerous tissues. So far, SH2 profiling has largely been applied to the analysis of protein extracts, with the limitation that information on the spatial distribution and intensity of tyrosine phosphorylation within a tissue is lost. Here, we describe a novel SH2 domain based strategy for differential characterization of the state of tyrosine phosphorylation in formaldehyde-fixed and paraffin-embedded tissues. This approach demonstrates that SH2 domains may serve as very valuable tools for the analysis of the differential state of tyrosine phosphorylation in primary tissues fixed and processed under conditions frequently applied by routine pathology laboratories.

  16. 2017 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2017-10-01

    2017 Emerging Technology Domains Risk Survey. Daniel Klinedinst, Joel Land, and Kyle O'Meara. CMU/SEI technical report, October 2017. Distribution Statement A: Approved for Public Release; Distribution is Unlimited. The report tabulates new and emerging technologies, the security impact of each, and severity classifications with impact scores.

  17. 2016 Emerging Technology Domains Risk Survey

    Science.gov (United States)

    2016-04-05

    The report describes the measures upon which the CERT/CC based its recommendations, how each domain was triaged for importance, and exploitation examples for each domain; for instance, only a few vehicles once had access to a cellular Internet connection, and only at 3G speeds, while some vehicles already have LTE connections, and many. Distribution Statement A: Approved for Public Release; Distribution is Unlimited.

  18. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  19. Evaluation of the TSC Dolphin Computer Assisted Instructional System in the Chapter 1 Program of the District of Columbia Public Schools. Final Report 85-9.

    Science.gov (United States)

    Harris, Carolyn DeMeyer; And Others

    Dolphin is a computer-assisted instruction system used to teach and reinforce skills in reading, language arts, and mathematics. An evaluation of this system was conducted to provide information to TSC Division of Houghton Mifflin regarding its effectiveness and possible modifications to the system. The general design of the evaluation was to…

  20. Bulletin bibliographique sur l'E.A.O. (l'enseignement assiste par ordinateur) (Bibliographic Bulletin on Computer Assisted Instruction). Publication K-4.

    Science.gov (United States)

    LaForge, Lorne, Ed.

    The bibliography contains about 150 citations of journal articles, monographs, collected works, research reports, and essays drawn from the BIBELO database and concerning computer-assisted language instruction. The first half of the volume is an annotated bibliography in alphabetical order by author. The second section contains subject and…

  1. The DIMA web resource--exploring the protein domain network.

    Science.gov (United States)

    Pagel, Philipp; Oesterheld, Matthias; Stümpflen, Volker; Frishman, Dmitrij

    2006-04-15

    Conserved domains represent essential building blocks of most known proteins. Owing to their role as modular components carrying out specific functions, they form a network based both on functional relations and on direct physical interactions. We have previously shown that domain interaction networks provide substantially novel information with respect to networks built on full-length protein chains. In this work we present a comprehensive web resource for interactively exploring the Domain Interaction MAp (DIMA). The tool aims to integrate multiple data sources and prediction techniques, two of which have been implemented so far: domain phylogenetic profiling and experimentally demonstrated domain contacts from known three-dimensional structures. A powerful yet simple user interface enables the user to compute, visualize, navigate and download domain networks based on specific search criteria. http://mips.gsf.de/genre/proj/dima
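Domain phylogenetic profiling, one of the two DIMA prediction techniques, links domains whose presence/absence patterns across genomes are similar. A toy sketch using Jaccard similarity of presence sets; the domain names and genome sets are hypothetical, made up for illustration:

```python
def jaccard(a, b):
    """Similarity of two presence/absence profiles given as sets of genome IDs."""
    return len(a & b) / len(a | b)

# Hypothetical profiles: genomes in which each domain family occurs.
profiles = {
    "SH2": {"human", "mouse", "fly", "worm"},
    "SH3": {"human", "mouse", "fly", "worm", "yeast"},
    "PAS": {"human", "ecoli", "bsubtilis"},
}

print(jaccard(profiles["SH2"], profiles["SH3"]))  # → 0.8 (similar profiles)
print(jaccard(profiles["SH2"], profiles["PAS"]))  # much lower similarity
```

Domain pairs with highly similar profiles become candidate edges in the predicted domain interaction network.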

  2. Domain Adaptation for Pedestrian Detection Based on Prediction Consistency

    Directory of Open Access Journals (Sweden)

    Yu Li-ping

    2014-01-01

    Full Text Available Pedestrian detection is an active area of research in computer vision. It remains a quite challenging problem in many applications where many factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if abundant target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from the consistency of predictions with the source classifications, to selectively choose the source domain samples that show positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially when labeled data in the target scene are insufficient.

  3. Computer vision and machine learning for archaeology

    NARCIS (Netherlands)

    van der Maaten, L.J.P.; Boon, P.; Lange, G.; Paijmans, J.J.; Postma, E.

    2006-01-01

    Until now, computer vision and machine learning techniques have barely contributed to the archaeological domain. The use of these techniques can support archaeologists in their assessment and classification of archaeological finds. The paper illustrates the use of computer vision techniques for

  4. RISC Processors and High Performance Computing

    Science.gov (United States)

    Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.

  5. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Foundations of Neuromorphic Computing. Final technical report, May 2013, covering work from 2009 to September 2012; approved for public release, distribution unlimited. The report contrasts two paradigms, few sensors/complex computations and many sensors/simple computation, and discusses challenges with nano-enabled neuromorphic chips.

  6. Domain decomposition methods for the neutron diffusion problem

    International Nuclear Information System (INIS)

    Guerin, P.; Baudron, A. M.; Lautard, J. J.

    2010-01-01

    The neutronic simulation of a nuclear reactor core is performed using the neutron transport equation, and leads to an eigenvalue problem in the steady-state case. Among the deterministic resolution methods, simplified transport (SPN) or diffusion approximations are often used. The MINOS solver developed at CEA Saclay uses a mixed dual finite element method for the resolution of these problems, and has demonstrated its efficiency. In order to take into account the heterogeneities of the geometry, a very fine mesh is generally required, which leads to expensive calculations for industrial applications. In order to take advantage of parallel computers, and to reduce the computing time and the local memory requirement, we propose here two domain decomposition methods based on the MINOS solver. The first approach is a component mode synthesis method on overlapping sub-domains: several eigenmode solutions of a local problem on each sub-domain are taken as basis functions for the resolution of the global problem on the whole domain. The second approach is an iterative method based on a non-overlapping domain decomposition with Robin interface conditions. At each iteration, we solve the problem on each sub-domain with the interface conditions given by the solutions on the adjacent sub-domains estimated at the previous iteration. Numerical results on parallel computers are presented for the diffusion model on realistic 2D and 3D cores. (authors)

  7. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression.

    Science.gov (United States)

    Yu, Xu; Lin, Jun-Yu; Jiang, Feng; Du, Jun-Wei; Han, Ji-Zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into a weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared with many state-of-the-art single-domain or cross-domain CF methods.
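The LWLR building block used in the final step can be sketched as follows. This is generic locally weighted linear regression, not the FCLWLR feature construction itself; the Gaussian kernel bandwidth `tau` and the one-dimensional toy data are assumptions.

```python
import numpy as np

def lwlr_predict(x0, X, y, tau=0.5):
    """Locally weighted linear regression prediction at query point x0.

    Solves theta = argmin sum_i w_i (y_i - theta . [1, x_i])^2 with Gaussian
    kernel weights w_i = exp(-(x_i - x0)^2 / (2 tau^2)), so nearby samples
    dominate the fit.
    """
    Xb = np.column_stack([np.ones(len(X)), X])        # add intercept column
    w = np.exp(-((X - x0) ** 2) / (2 * tau ** 2))
    A = Xb.T @ (w[:, None] * Xb)
    theta = np.linalg.solve(A, Xb.T @ (w * y))
    return float(np.array([1.0, x0]) @ theta)

X = np.linspace(0.0, 4.0, 41)
y = 2.0 * X + 1.0                                     # noiseless linear "ratings"
print(lwlr_predict(2.0, X, y))                        # → 5.0 (recovers 2*2 + 1)
```

Because a fresh weighted fit is solved per query point, no global parametric form is imposed, which is the nonparametric property the abstract credits with avoiding under- and overfitting.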

  8. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression

    Directory of Open Access Journals (Sweden)

    Xu Yu

    2018-01-01

    Full Text Available Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into a weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared with many state-of-the-art single-domain or cross-domain CF methods.

  9. Functional Domain Driven Design

    OpenAIRE

    Herrera Guzmán, Sergio

    2016-01-01

    Technologies are constantly expanding and evolving, devising new techniques to fulfill their purpose. In software development, the tools and guidelines for building software products are a constantly evolving component, necessary for making decisions about the projects to be undertaken. One of the archetypes for software development is the so-called Domain Driven Design, in which it is important to have a broad understanding of the business one wishes to model in...

  10. Public Library Training Program for Older Adults Addresses Their Computer and Health Literacy Needs. A Review of: Xie, B. (2011. Improving older adults’ e-health literacy through computer training using NIH online resources. Library & Information Science Research, 34, 63-71. doi: /10.1016/j.lisr.2011.07.006

    Directory of Open Access Journals (Sweden)

    Cari Merkley

    2012-12-01

    Full Text Available Objective – To evaluate the efficacy of an e-health literacy educational intervention aimed at older adults. Design – Pre and post intervention questionnaires administered in an experimental study. Setting – Two public library branches in Maryland. Subjects – 218 adults between 60 and 89 years of age. Methods – A convenience sample of older adults was recruited to participate in a four week training program structured around the National Institutes of Health toolkit Helping Older Adults Search for Health Information Online. During the program, classes met at the participating libraries twice a week. Sessions were two hours in length, and employed hands-on exercises led by Master of Library Science students. The training included an introduction to the Internet, as well as in-depth training in the use of the NIHSeniorHealth and MedlinePlus websites. In the first class, participants were asked to complete a pre-training questionnaire that included questions relating to demographics and previous computer and Internet experience, as well as measures from the Computer Anxiety Scale and two subscales of the Attitudes toward Computers Questionnaire. Participants between September 2008 and June 2009 also completed pre-training computer and web knowledge tests that asked individuals to label the parts of a computer and of a website using a provided list of terms. At the end of the program, participants were asked to complete post-training questionnaires that included the previously employed questions from the Computer Anxiety Scale and Attitudes toward Computers Questionnaire. New questions were added relating to the participants' satisfaction with the training, its impact on their health decision making, their perceptions of public libraries, and the perceived usability and utility of the two websites highlighted during the training program. Those who completed pre-training knowledge tests were also asked to complete the same exercises at the end of the program. Main Results

  11. Feature-level domain adaptation

    DEFF Research Database (Denmark)

    Kouw, Wouter M.; Van Der Maaten, Laurens J P; Krijthe, Jesse H.

    2016-01-01

    Domain adaptation is the supervised learning setting in which the training and test data are sampled from different distributions: training data is sampled from a source domain, whilst test data is sampled from a target domain. This paper proposes and studies an approach, called feature-level domain adaptation (flda), that models the dependence between the two domains by means of a feature-level transfer model that is trained to describe the transfer from source to target domain. Subsequently, we train a domain-adapted classifier by minimizing the expected loss under the resulting transfer model. The transfer is modeled via a dropout distribution, which allows the classifier to adapt to differences in the marginal probability of features in the source and the target domain. Our experiments on several real-world problems show that flda performs on par with state-of-the-art domain-adaptation techniques.

  12. Compensating for Incomplete Domain Knowledge

    National Research Council Canada - National Science Library

    Scott, Lynn M; Drezner, Steve; Rue, Rachel; Reyes, Jesse

    2007-01-01

    .... First, many senior leader positions require experience in more than one functional or operational domain, but it is difficult to develop a corps of senior leaders with all the required combinations of domain knowledge...

  13. Ligand binding by PDZ domains

    DEFF Research Database (Denmark)

    Chi, Celestine N.; Bach, Anders; Strømgaard, Kristian

    2012-01-01

    , for example, are particularly rich in these domains. The general function of PDZ domains is to bring proteins together within the appropriate cellular compartment, thereby facilitating scaffolding, signaling, and trafficking events. The many functions of PDZ domains under normal physiological as well as pathological conditions have been reviewed recently. In this review, we focus on the molecular details of how PDZ domains bind their protein ligands and their potential as drug targets in this context.

  14. Summarization by domain ontology navigation

    DEFF Research Database (Denmark)

    Andreasen, Troels; Bulskov, Henrik

    2013-01-01

    of the subject. In between these two extremes, conceptual summaries encompass selected concepts derived using background knowledge. We address in this paper an approach where conceptual summaries are provided through a conceptualization as given by an ontology. The ontology guiding the summarization can be a simple taxonomy or a generative domain ontology. A domain ontology can be provided by a preanalysis of a domain corpus and can be used to condense improved summaries that better reflect the conceptualization of a given domain.

  15. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  16. Domain Theory for Concurrency

    DEFF Research Database (Denmark)

    Nygaard, Mikkel

    and associated comonads, it highlights the role of linearity in concurrent computation. Two choices of comonad yield two expressive metalanguages for higher-order processes, both arising from canonical constructions in the model. Their denotational semantics are fully abstract with respect to contextual equivalence. One language, called HOPLA for Higher-Order Process LAnguage, derives from an exponential of linear logic. It can be viewed as an extension of the simply-typed lambda calculus with CCS-like nondeterministic sum and prefix operations, in which types express the form of computation path of which

  17. Domain Specific Language Support for Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    A multi-institutional project known as D-TEC (short for “Domain- specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keep exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions in the multi-institutional project. D-TEC research and development work at Rice University focused on on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.

  18. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  19. Expansion of protein domain repeats.

    Directory of Open Access Journals (Sweden)

    Asa K Björklund

    2006-08-01

    Full Text Available Many proteins, especially in eukaryotes, contain tandem repeats of several domains from the same family. These repeats have a variety of binding properties and are involved in protein-protein interactions as well as binding to other ligands such as DNA and RNA. The rapid expansion of protein domain repeats is assumed to have evolved through internal tandem duplications. However, the exact mechanisms behind these tandem duplications are not well understood. Here, we have studied the evolution, function, protein structure, gene structure, and phylogenetic distribution of domain repeats. For this purpose we have assigned Pfam-A domain families to 24 proteomes with more sensitive domain assignments in the repeat regions. These assignments confirmed previous findings that eukaryotes, and in particular vertebrates, contain a much higher fraction of proteins with repeats compared with prokaryotes. The internal sequence similarity in each protein revealed that the domain repeats are often expanded through duplications of several domains at a time, while the duplication of one domain is less common. Many of the repeats appear to have been duplicated in the middle of the repeat region. This is in strong contrast to the evolution of other proteins, which mainly proceeds through additions of single domains at either terminus. Further, we found that some domain families show distinct duplication patterns, e.g., nebulin domains have mainly been expanded with a unit of seven domains at a time, while duplications of other domain families involve varying numbers of domains. Finally, no common mechanism for the expansion of all repeats could be detected. We found that the duplication patterns show no dependence on the size of the domains. Further, repeat expansion in some families can possibly be explained by shuffling of exons. However, exon shuffling could not have created all repeats.
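
    The duplication patterns described in this record (e.g., nebulin's seven-domain unit) can be probed with a toy check for the smallest tandemly repeated unit in a domain architecture. The sketch below is illustrative only, not the authors' method, and the domain names used are hypothetical.

```python
def repeat_unit(domains):
    """Return the shortest domain unit whose tandem repetition
    reproduces the whole architecture (the list itself if none)."""
    n = len(domains)
    for size in range(1, n + 1):
        if n % size == 0 and domains[:size] * (n // size) == domains:
            return domains[:size]
    return domains

# An architecture expanded by duplicating a three-domain unit:
unit = repeat_unit(["Ig", "Fn3", "Kinase"] * 2)   # three-domain repeat unit
```

    Applied to a proteome-scale set of architectures, a check like this separates single-domain expansions from multi-domain unit duplications.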

  20. Slang: A Male Domain?

    Science.gov (United States)

    de Klerk, Vivian

    1990-01-01

    A Grahamstown (South Africa) survey determining the number of slang words known by 12- to 17-year-old public and private school students demonstrates that age, not sex, is the more significant variable, although school type is also important. Predicts that slang usage by girls may soon equal that of boys. (DM)

  1. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  2. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  3. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  4. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.H.P.

    2015-01-01

    Let $A$ be a Dedekind domain, $K$ the fraction field of $A$, and $f\in A[x]$ a monic irreducible separable polynomial. For a given non-zero prime ideal $\mathfrak{p}$ of $A$ we present in this paper a new method to compute a $\mathfrak{p}$-integral basis of the extension of $K$ determined by $f$.

  5. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  6. Time-Domain Simulation of RF Couplers

    International Nuclear Information System (INIS)

    Smithe, David; Carlsson, Johan; Austin, Travis

    2009-01-01

    We have developed a finite-difference time-domain (FDTD) fluid-like approach to integrated plasma-and-coupler simulation [1], and show how it can be used to model LH and ICRF couplers in the MST and larger tokamaks.[2] This approach permits very accurate 3-D representation of coupler geometry, and easily includes non-axi-symmetry in vessel wall, magnetic equilibrium, and plasma density. The plasma is integrated with the FDTD Maxwell solver in an implicit solve that steps over electron time-scales, and permits tenuous plasma in the coupler itself, without any need to distinguish or interface between different regions of vacuum and/or plasma. The FDTD algorithm is also generalized to incorporate a time-domain sheath potential [3] on metal structures within the simulation, to look for situations where the sheath potential might generate local sputtering opportunities. Benchmarking of the time-domain sheath algorithm has been reported in the references. Finally, the time-domain software [4] permits the use of particles, either as field diagnostic (test particles) or to self-consistently compute plasma current from the applied RF power.
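
    The FDTD discretization at the heart of such solvers is easy to illustrate in one dimension. The sketch below is a minimal vacuum Yee-leapfrog update in normalized units (Courant number 1), not the integrated plasma-and-coupler solver of the record; the grid size and the Gaussian source are arbitrary choices.

```python
import numpy as np

def fdtd_1d(nx=200, nt=250, src=100):
    """Minimal 1-D vacuum FDTD: staggered E/H fields advanced in leapfrog."""
    ez = np.zeros(nx)        # E-field on integer grid points
    hy = np.zeros(nx - 1)    # H-field on half-integer points
    for t in range(nt):
        hy += ez[1:] - ez[:-1]                        # update H from curl E
        ez[1:-1] += hy[1:] - hy[:-1]                  # update E from curl H
        ez[src] += np.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
    return ez

field = fdtd_1d()
```

    The real solver adds 3-D geometry, an implicit plasma response, and sheath boundary conditions on top of this basic leapfrog core.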

  7. Astrocyte mega-domain hypothesis of the autistic savantism.

    Science.gov (United States)

    Mitterauer, Bernhard J

    2013-01-01

    Individuals with autism who show high abilities are called savants. Whereas a disconnection in and between neural networks has been identified in their brains, savantism is as yet poorly understood. Focusing on astrocyte domain organization, it is hypothesized that local astrocyte mega-organizations may be responsible for exerting high capabilities in brains of autistic savants. Astrocytes, the dominant glial cell type, modulate synaptic information transmission. Each astrocyte is organized in non-overlapping domains. Formally, each astrocyte contacting n-neurons with m-synapses via its processes generates dynamic domains of synaptic interactions based on qualitative computation criteria, thereby structuring neuronal information processing. If the number of processes is genetically significantly increased, these astrocytes operate in a mega-domain with a higher complexity of computation. From this model savant abilities are deduced. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Downloadable Computational Toxicology Data

    Science.gov (United States)

    EPA’s computational toxicology research generates data that investigates the potential harm, or hazard of a chemical, the degree of exposure to chemicals as well as the unique chemical characteristics. This data is publicly available here.

  9. Time Domain Induced Polarization

    DEFF Research Database (Denmark)

    Fiandaca, Gianluca; Auken, Esben; Christiansen, Anders Vest

    2012-01-01

    Time-domain induced polarization has significantly broadened its field of reference during the last decade, from mineral exploration to environmental geophysics, e.g., for clay and peat identification and landfill characterization. However, insufficient modeling tools have hitherto limited the use of time-domain induced polarization for wider purposes. For these reasons, a new forward code and inversion algorithm have been developed, using the full time decay of the induced polarization response together with an accurate description of the transmitter waveform and of the receiver transfer function, to reconstruct the distribution of the Cole-Cole parameters of the earth. The accurate modeling of the transmitter waveform had a strong influence on the forward response, and we showed that the difference between a solution using a step response and a solution using the accurate modeling often is above 100…
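
    The Cole-Cole parameters mentioned here enter through the Pelton complex-resistivity model, whose frequency-domain form is compact to write down. The parameter values below are chosen purely for illustration.

```python
import numpy as np

def cole_cole(omega, rho0, m, tau, c):
    """Pelton Cole-Cole complex resistivity:
    rho(w) = rho0 * (1 - m * (1 - 1 / (1 + (i*w*tau)**c)))."""
    return rho0 * (1 - m * (1 - 1.0 / (1 + (1j * omega * tau) ** c)))

w = np.logspace(-2, 4, 61)                               # angular frequencies
rho = cole_cole(w, rho0=100.0, m=0.2, tau=0.1, c=0.5)
# DC limit -> rho0; high-frequency limit -> rho0 * (1 - m)
```

    Time-domain responses are obtained from this spectrum by convolving with the transmitter waveform, which is exactly where the record's waveform modeling matters.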

  10. Screen time by different devices in adolescents: association with physical inactivity domains and eating habits.

    Science.gov (United States)

    Delfino, Leandro D; Dos Santos Silva, Diego A; Tebar, William R; Zanuto, Edner F; Codogno, Jamile S; Fernandes, Rômulo A; Christofaro, Diego G

    2018-03-01

    Sedentary behaviors in adolescents are associated with using screen devices, analyzed as the total daily time in television viewing, using the computer and video game. However, an independent and clustered analysis of devices allows greater understanding of associations with physical inactivity domains and eating habits in adolescents. The sample comprised randomly selected adolescents aged 10-17 years (N=1011) from public and private schools. The use of screen devices was measured by hours per week spent in each device: TV, computer, videogames and mobile phone/tablet. Physical inactivity domains (school, leisure and sports), eating habits (weekly food consumption frequency) and socioeconomic status were assessed by questionnaire. The prevalence of high use of mobile phone/tablet was 70% among adolescents, 63% showed high use of TV or computer and 24% reported high use of videogames. High use of videogames was greater among boys and high use of mobile phone/tablet was higher among girls. Significant associations were observed between consumption of snacks and high use of TV (OR=1.43, 95% CI: 1.04-1.99), computer (OR=1.44, 95% CI: 1.03-2.02) and videogames (OR=1.65, 95% CI: 1.13-2.69). High use of computer was associated with fried foods consumption (OR=1.32, 95% CI: 1.01-1.75) and physical inactivity (OR=1.41, 95% CI: 1.03-1.95). Mobile phone was associated with consumption of sweets (OR=1.33, 95% CI: 1.00-1.80). Clustered use of screen devices showed associations with high consumption of snacks, fried foods and sweets, even after controlling for confounding variables. The high use of screen devices was associated with high consumption of snacks, fried foods, sweets and physical inactivity in adolescents.

  11. Domain architecture conservation in orthologs

    Science.gov (United States)

    2011-01-01

    Background As orthologous proteins are expected to retain function more often than other homologs, they are often used for functional annotation transfer between species. However, ortholog identification methods do not take into account changes in domain architecture, which are likely to modify a protein's function. By domain architecture we refer to the sequential arrangement of domains along a protein sequence. To assess the level of domain architecture conservation among orthologs, we carried out a large-scale study of such events between human and 40 other species spanning the entire evolutionary range. We designed a score to measure domain architecture similarity and used it to analyze differences in domain architecture conservation between orthologs and paralogs relative to the conservation of primary sequence. We also statistically characterized the extents of different types of domain swapping events across pairs of orthologs and paralogs. Results The analysis shows that orthologs exhibit greater domain architecture conservation than paralogous homologs, even when differences in average sequence divergence are compensated for, for homologs that have diverged beyond a certain threshold. We interpret this as an indication of a stronger selective pressure on orthologs than paralogs to retain the domain architecture required for the proteins to perform a specific function. In general, orthologs as well as the closest paralogous homologs have very similar domain architectures, even at large evolutionary separation. The most common domain architecture changes observed in both ortholog and paralog pairs involved insertion/deletion of new domains, while domain shuffling and segment duplication/deletion were very infrequent. Conclusions On the whole, our results support the hypothesis that function conservation between orthologs demands higher domain architecture conservation than other types of homologs, relative to primary sequence conservation. This supports the…
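
    A domain-architecture similarity score of the kind described in this record can be illustrated with a simple order-aware measure: the normalized longest common subsequence of two domain sequences. This is a generic stand-in, not the score designed in the study, and the domain names are hypothetical.

```python
def lcs_len(a, b):
    """Length of the longest common subsequence of two domain sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            dp[i][j] = (dp[i - 1][j - 1] + 1 if a[i - 1] == b[j - 1]
                        else max(dp[i - 1][j], dp[i][j - 1]))
    return dp[-1][-1]

def architecture_similarity(a, b):
    """Order-aware similarity in [0, 1] between two domain architectures."""
    return 2.0 * lcs_len(a, b) / (len(a) + len(b)) if a and b else 0.0

score = architecture_similarity(["SH3", "SH2", "Kinase"], ["SH2", "Kinase"])
```

    Averaging such a score over ortholog pairs versus paralog pairs, at matched sequence divergence, is one way to quantify the conservation difference the study reports.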

  12. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  13. Domain decomposition method for solving the neutron diffusion equation

    International Nuclear Information System (INIS)

    Coulomb, F.

    1989-03-01

    The aim of this work is to study methods for solving the neutron diffusion equation; we are interested in methods based on a classical finite element discretization and well suited for use on parallel computers. Domain decomposition methods seem to meet this need. This study deals with a decomposition of the domain. A theoretical study is carried out for Lagrange finite elements and some examples are given; in the case of mixed dual finite elements, the study is based on examples [fr
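
    The flavor of such a decomposition can be sketched with the classical alternating Schwarz method on a 1-D model problem (-u'' = 1 with homogeneous Dirichlet data), each overlapping subdomain being solved by a small finite-difference system. The grid size and overlap below are arbitrary illustrative choices, far simpler than the finite-element setting of the record.

```python
import numpy as np

def schwarz_poisson(n=41, overlap=8, iters=30):
    """Alternating Schwarz for -u'' = 1 on (0,1), u(0) = u(1) = 0,
    with two overlapping subdomains solved by dense finite differences."""
    h = 1.0 / (n - 1)
    u = np.zeros(n)
    lo, hi = n // 2 - overlap // 2, n // 2 + overlap // 2

    def solve(a, b, left, right):
        m = b - a - 1                        # interior unknowns of the subdomain
        A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h ** 2
        rhs = np.ones(m)
        rhs[0] += left / h ** 2              # Dirichlet data from the other subdomain
        rhs[-1] += right / h ** 2
        return np.linalg.solve(A, rhs)

    for _ in range(iters):
        u[1:hi] = solve(0, hi, 0.0, u[hi])              # left subdomain
        u[lo + 1:n - 1] = solve(lo, n - 1, u[lo], 0.0)  # right subdomain
    return u

u = schwarz_poisson()   # exact solution is u(x) = x(1 - x)/2
```

    The two subdomain solves per sweep are independent apart from the exchanged interface values, which is exactly what makes this family of methods attractive on parallel computers.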

  14. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    background on China’s scientific research and on their computer science before 1978. A useful companion to the directory is another publication of the… bimonthly publication in Portuguese; occasional translation of foreign articles into Portuguese. Data News: a bimonthly industry newsletter. Sistemas: …computer-related topics; Spanish. Delta: publication of local users group; Spanish. Sistemas: publication of System Engineers of Colombia; Spanish. CUBA

  15. Cloud identification using genetic algorithms and massively parallel computation

    Science.gov (United States)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). Given that one is willing to run the experiment several times (say 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user…
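
    The subpopulation-and-migration scheme studied in this record can be sketched as an island-model genetic algorithm. The toy below maximizes the standard OneMax function with ring migration every few generations; all parameters and operators are illustrative, not those of the MasPar implementation.

```python
import random

def island_ga(islands=4, pop=20, length=30, gens=60, migrate_every=10, seed=1):
    """Toy island-model genetic algorithm on OneMax with ring migration."""
    rng = random.Random(seed)
    fitness = sum                                   # OneMax: count the 1 bits
    demes = [[[rng.randint(0, 1) for _ in range(length)] for _ in range(pop)]
             for _ in range(islands)]
    for g in range(1, gens + 1):
        for d in demes:
            d.sort(key=fitness, reverse=True)
            elite = d[:pop // 2]                    # truncation selection
            children = []
            while len(elite) + len(children) < pop:
                a, b = rng.sample(elite, 2)
                cut = rng.randrange(1, length)
                child = a[:cut] + b[cut:]           # one-point crossover
                child[rng.randrange(length)] ^= 1   # point mutation
                children.append(child)
            d[:] = elite + children
        if g % migrate_every == 0:                  # ring migration of each island's best
            best = [max(d, key=fitness) for d in demes]
            for i, d in enumerate(demes):
                d[-1] = list(best[(i - 1) % islands])
    return max((max(d, key=fitness) for d in demes), key=fitness)

best = island_ga()
```

    On a SIMD machine each deme evolves on its own processor block, and migration is the only inter-processor communication, which is what makes the island model attractive for massively parallel hardware.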

  16. Scaling properties of domain wall networks

    International Nuclear Information System (INIS)

    Leite, A. M. M.; Martins, C. J. A. P.

    2011-01-01

    We revisit the cosmological evolution of domain wall networks, taking advantage of recent improvements in computing power. We carry out high-resolution field theory simulations in two, three and four spatial dimensions to study the effects of dimensionality and damping on the evolution of the network. Our results are consistent with the expected scale-invariant evolution of the network, which suggests that previous hints of deviations from this behavior may have been due to the limited dynamical range of those simulations. We also use the results of very large (1024^3) simulations in three cosmological epochs to provide a calibration for the velocity-dependent one-scale model for domain walls: we numerically determine the two free model parameters to have the values c_w = 0.5 ± 0.2 and k_w = 1.1 ± 0.3.
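
    The velocity-dependent one-scale model mentioned here can be integrated directly. The sketch below uses one commonly quoted form of the wall VOS equations; the exact form and coefficients vary in the literature, so treat the equations themselves as an assumption. It checks that the correlation length per time, L/t, and the velocity v approach constants (scaling) in the radiation era.

```python
import numpy as np

def vos_walls(cw=0.5, kw=1.1, t0=1.0, t1=1e6, n=200000):
    """Euler-integrate a VOS-type model for domain walls in the radiation era
    (H = 1/(2t)):  dL/dt = H*L*(1 + 3 v^2) + cw*v,
                   dv/dt = (1 - v^2)*(kw/L - 3*H*v).
    This particular form is an assumption, not taken from the record."""
    ts = np.logspace(np.log10(t0), np.log10(t1), n)
    L, v = t0, 0.1
    for i in range(1, n):
        t = ts[i - 1]
        dt = ts[i] - t
        H = 0.5 / t
        L, v = (L + dt * (H * L * (1 + 3 * v * v) + cw * v),
                v + dt * (1 - v * v) * (kw / L - 3 * H * v))
    return L / t1, v   # scaling means both tend to constants

eps, v = vos_walls()
```

    For these equations the scaling fixed point is v² = k_w / (3 (k_w + c_w)) and L/t = 2 k_w / (3 v), so the calibrated parameters translate directly into predicted network properties.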

  17. Multiple Shooting and Time Domain Decomposition Methods

    CERN Document Server

    Geiger, Michael; Körkel, Stefan; Rannacher, Rolf

    2015-01-01

    This book offers a comprehensive collection of the most advanced numerical techniques for the efficient and effective solution of simulation and optimization problems governed by systems of time-dependent differential equations. The contributions present various approaches to time domain decomposition, focusing on multiple shooting and parareal algorithms.  The range of topics covers theoretical analysis of the methods, as well as their algorithmic formulation and guidelines for practical implementation. Selected examples show that the discussed approaches are mandatory for the solution of challenging practical problems. The practicability and efficiency of the presented methods is illustrated by several case studies from fluid dynamics, data compression, image processing and computational biology, giving rise to possible new research topics.  This volume, resulting from the workshop Multiple Shooting and Time Domain Decomposition Methods, held in Heidelberg in May 2013, will be of great interest to applied...
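
    Of the approaches collected in this volume, parareal is the most compact to sketch: a cheap coarse propagator predicts the slice endpoints serially, and fine solves over the slices (the parallelizable part) correct them iteratively. The sketch below applies it to y' = -y; both propagators are plain Euler integrators chosen for brevity.

```python
import math

def parareal(f, y0, T, slices=10, iters=5, fine_steps=100):
    """Minimal parareal iteration for a scalar ODE y' = f(y) on [0, T]."""
    dt = T / slices
    coarse = lambda y: y + dt * f(y)                 # one Euler step per slice
    def fine(y):                                     # many Euler sub-steps per slice
        h = dt / fine_steps
        for _ in range(fine_steps):
            y += h * f(y)
        return y
    U = [y0]
    for _ in range(slices):                          # serial coarse prediction
        U.append(coarse(U[-1]))
    for _ in range(iters):
        F = [fine(u) for u in U[:-1]]                # parallelizable fine solves
        G = [coarse(u) for u in U[:-1]]
        V = [y0]
        for i in range(slices):                      # serial correction sweep
            V.append(coarse(V[i]) + F[i] - G[i])
        U = V
    return U[-1]

y_end = parareal(lambda y: -y, 1.0, 2.0)             # exact answer: exp(-2)
```

    After k iterations the first k slice values coincide with the serial fine solution, so the iteration trades a few correction sweeps for parallelism across the time slices.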

  18. Modeling Network Traffic in Wavelet Domain

    Directory of Open Access Journals (Sweden)

    Sheng Ma

    2004-12-01

    Full Text Available This work discovers that although network traffic has the complicated short- and long-range temporal dependence, the corresponding wavelet coefficients are no longer long-range dependent. Therefore, a "short-range" dependent process can be used to model network traffic in the wavelet domain. Both independent and Markov models are investigated. Theoretical analysis shows that the independent wavelet model is sufficiently accurate in terms of the buffer overflow probability for Fractional Gaussian Noise traffic. Any model, which captures additional correlations in the wavelet domain, only improves the performance marginally. The independent wavelet model is then used as a unified approach to model network traffic including VBR MPEG video and Ethernet data. The computational complexity is O(N) for developing such wavelet models and generating synthesized traffic of length N, which is among the lowest attained.
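
    The O(N) generation claim follows from the pyramid structure of the inverse wavelet transform. The sketch below draws independent Haar detail coefficients with a geometric per-scale variance decay (the decay rate is illustrative, not a calibrated traffic model) and inverts in O(N).

```python
import numpy as np

def synth_trace(levels=10, decay=0.7, seed=0):
    """Synthesize a length-2**levels trace from independent wavelet
    coefficients via an O(N) inverse orthonormal Haar transform."""
    rng = np.random.default_rng(seed)
    a = rng.normal(size=1)                                  # coarsest approximation
    for j in range(levels):
        d = rng.normal(scale=decay ** j, size=a.size)       # independent details
        x = np.empty(2 * a.size)
        x[0::2] = (a + d) / np.sqrt(2.0)                    # inverse Haar butterfly
        x[1::2] = (a - d) / np.sqrt(2.0)
        a = x
    return a

trace = synth_trace()
```

    Each level doubles the signal length with a constant amount of work per sample, which is where the overall O(N) cost comes from.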

  19. Protein domain organisation: adding order

    Directory of Open Access Journals (Sweden)

    Kummerfeld Sarah K

    2009-01-01

    Full Text Available Abstract Background Domains are the building blocks of proteins. During evolution, they have been duplicated, fused and recombined, to produce proteins with novel structures and functions. Structural and genome-scale studies have shown that pairs or groups of domains observed together in a protein are almost always found in only one N to C terminal order and are the result of a single recombination event that has been propagated by duplication of the multi-domain unit. Previous studies of domain organisation have used graph theory to represent the co-occurrence of domains within proteins. We build on this approach by adding directionality to the graphs and connecting nodes based on their relative order in the protein. Most of the time, the linear order of domains is conserved. However, using the directed graph representation we have identified non-linear features of domain organization that are over-represented in genomes. Recognising these patterns and unravelling how they have arisen may allow us to understand the functional relationships between domains and understand how the protein repertoire has evolved. Results We identify groups of domains that are not linearly conserved, but instead have been shuffled during evolution so that they occur in multiple different orders. We consider 192 genomes across all three kingdoms of life and use domain and protein annotation to understand their functional significance. To identify these features and assess their statistical significance, we represent the linear order of domains in proteins as a directed graph and apply graph theoretical methods. We describe two higher-order patterns of domain organisation: clusters and bi-directionally associated domain pairs and explore their functional importance and phylogenetic conservation. Conclusion Taking into account the order of domains, we have derived a novel picture of global protein organization. We found that all genomes have a higher than expected

  20. Protein domain organisation: adding order.

    Science.gov (United States)

    Kummerfeld, Sarah K; Teichmann, Sarah A

    2009-01-29

    Domains are the building blocks of proteins. During evolution, they have been duplicated, fused and recombined, to produce proteins with novel structures and functions. Structural and genome-scale studies have shown that pairs or groups of domains observed together in a protein are almost always found in only one N to C terminal order and are the result of a single recombination event that has been propagated by duplication of the multi-domain unit. Previous studies of domain organisation have used graph theory to represent the co-occurrence of domains within proteins. We build on this approach by adding directionality to the graphs and connecting nodes based on their relative order in the protein. Most of the time, the linear order of domains is conserved. However, using the directed graph representation we have identified non-linear features of domain organization that are over-represented in genomes. Recognising these patterns and unravelling how they have arisen may allow us to understand the functional relationships between domains and understand how the protein repertoire has evolved. We identify groups of domains that are not linearly conserved, but instead have been shuffled during evolution so that they occur in multiple different orders. We consider 192 genomes across all three kingdoms of life and use domain and protein annotation to understand their functional significance. To identify these features and assess their statistical significance, we represent the linear order of domains in proteins as a directed graph and apply graph theoretical methods. We describe two higher-order patterns of domain organisation: clusters and bi-directionally associated domain pairs and explore their functional importance and phylogenetic conservation. Taking into account the order of domains, we have derived a novel picture of global protein organization. We found that all genomes have a higher than expected degree of clustering and more domain pairs in forward and
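
    The directed-graph representation used in this study is straightforward to reproduce in miniature: count N-to-C adjacencies of consecutive domains across proteins, then look for pairs that occur in both orders. The architectures below are hypothetical examples.

```python
from collections import Counter

def domain_digraph(proteins):
    """Directed edge counts over consecutive (N-to-C) domain pairs."""
    edges = Counter()
    for arch in proteins:
        for a, b in zip(arch, arch[1:]):
            edges[(a, b)] += 1
    return edges

def bidirectional_pairs(edges):
    """Domain pairs observed in both N-to-C orders (the non-linear cases)."""
    return {frozenset(e) for e in edges
            if e[0] != e[1] and (e[1], e[0]) in edges}

edges = domain_digraph([["PH", "SH2", "Kinase"], ["SH2", "PH"]])
```

    At genome scale, comparing the observed counts of such bi-directional pairs against a shuffled null model gives the over-representation statistics the record describes.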

  1. Prediction Reweighting for Domain Adaptation.

    Science.gov (United States)

    Shuang Li; Shiji Song; Gao Huang

    2017-07-01

    There are plenty of classification methods that perform well when training and testing data are drawn from the same distribution. However, in real applications, this condition may be violated, which causes degradation of classification accuracy. Domain adaptation is an effective approach to address this problem. In this paper, we propose a general domain adaptation framework from the perspective of prediction reweighting, from which a novel approach is derived. Different from the major domain adaptation methods, our idea is to reweight predictions of the training classifier on testing data according to their signed distance to the domain separator, which is a classifier that distinguishes training data (from source domain) and testing data (from target domain). We then propagate the labels of target instances with larger weights to ones with smaller weights by introducing a manifold regularization method. It can be proved that our reweighting scheme effectively brings the source and target domains closer to each other in an appropriate sense, such that classification in target domain becomes easier. The proposed method can be implemented efficiently by a simple two-stage algorithm, and the target classifier has a closed-form solution. The effectiveness of our approach is verified by the experiments on artificial datasets and two standard benchmarks, a visual object recognition task and a cross-domain sentiment analysis of text. Experimental results demonstrate that our method is competitive with the state-of-the-art domain adaptation algorithms.
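
    The core idea, weighting target predictions by their signed distance to a domain separator, can be shown with a least-squares separator in place of the paper's classifier. Everything below (the fit, the logistic squashing, the toy data) is an illustrative stand-in, not the published algorithm.

```python
import numpy as np

def domain_weights(X_src, X_tgt):
    """Fit a linear separator between source (-1) and target (+1) samples and
    map each target point's signed distance to a weight in (0, 1)."""
    X = np.vstack([X_src, X_tgt])
    Xb = np.hstack([X, np.ones((len(X), 1))])        # add a bias column
    d = np.concatenate([-np.ones(len(X_src)), np.ones(len(X_tgt))])
    w, *_ = np.linalg.lstsq(Xb, d, rcond=None)       # least-squares domain separator
    margin = np.hstack([X_tgt, np.ones((len(X_tgt), 1))]) @ w
    return 1.0 / (1.0 + np.exp(-margin))             # deeper in target -> larger weight

X_src = np.zeros((50, 2))                            # source cluster at the origin
X_tgt = np.array([[1.0, 1.0], [3.0, 3.0]])           # target points, one further out
weights = domain_weights(X_src, X_tgt)
```

    In the full method these weights then drive label propagation from high-weight to low-weight target instances via manifold regularization.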

  2. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications which include the domains of medicine, bioinformatics, electronics, communications and business. Progress has also been made in applying computational intelligence in the nuclear reactor domain over the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges in the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  3. A role for chromatin topology in imprinted domain regulation.

    Science.gov (United States)

    MacDonald, William A; Sachani, Saqib S; White, Carlee R; Mann, Mellissa R W

    2016-02-01

    Recently, many advancements in genome-wide chromatin topology and nuclear architecture have unveiled the complex and hidden world of the nucleus, where chromatin is organized into discrete neighbourhoods with coordinated gene expression. This includes the active and inactive X chromosomes. Using X chromosome inactivation as a working model, we utilized publicly available datasets together with a literature review to gain insight into topologically associated domains, lamin-associated domains, nucleolar-associating domains, scaffold/matrix attachment regions, and nucleoporin-associated chromatin and their role in regulating monoallelic expression. Furthermore, we comprehensively review for the first time the role of chromatin topology and nuclear architecture in the regulation of genomic imprinting. We propose that chromatin topology and nuclear architecture are important regulatory mechanisms for directing gene expression within imprinted domains. Furthermore, we predict that dynamic changes in chromatin topology and nuclear architecture play roles in tissue-specific imprint domain regulation during early development and differentiation.

  4. Public Values

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Rutgers, Mark R.

    2015-01-01

    This article provides the introduction to a symposium on contemporary public values research. It is argued that the contributions to this symposium represent a Public Values Perspective, distinct from other specific lines of research that also use public value as a core concept. Public administration is approached in terms of processes guided or restricted by public values and as public value creating: public management and public policy-making are both concerned with establishing, following and realizing public values. To study public values a broad perspective is needed. The article suggests a research agenda for this encompassing kind of public values research. Finally, the contributions to the symposium are introduced.

  5. Cloud computing and services science

    NARCIS (Netherlands)

    Ivanov, Ivan; van Sinderen, Marten J.; Shishkov, Boris

    2012-01-01

    This book is essentially a collection of the best papers of the International Conference on Cloud Computing and Services Science (CLOSER), which was held in Noordwijkerhout, The Netherlands on May 7–9, 2011. The conference addressed technology trends in the domain of cloud computing in relation to a

  6. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  7. Multifunctionalities driven by ferroic domains

    Science.gov (United States)

    Yang, J. C.; Huang, Y. L.; He, Q.; Chu, Y. H.

    2014-08-01

    Considerable attention has been paid to ferroic systems in pursuit of advanced applications in past decades. Most recently, the emergence and development of multiferroics, which exhibit the coexistence of different ferroic natures, has offered a new route to create functionalities in the system. In this manuscript, we step from domain engineering to explore a roadmap for discovering intriguing phenomena and multifunctionalities driven by periodic domain patterns. As-grown periodic domains, offering exotic order parameters, periodic local perturbations and the capability of tailoring local spin, charge, orbital and lattice degrees of freedom, are introduced as modeling templates for fundamental studies and novel applications. We discuss related significant findings on ferroic domains, nanoscopic domain walls, and conjunct heterostructures based on the well-organized domain patterns, and end with future prospects and challenges in the field.

  8. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  9. Mapping the Moral Domain

    Science.gov (United States)

    Graham, Jesse; Nosek, Brian A.; Haidt, Jonathan; Iyer, Ravi; Koleva, Spassena; Ditto, Peter H.

    2010-01-01

    The moral domain is broader than the empathy and justice concerns assessed by existing measures of moral competence, and it is not just a subset of the values assessed by value inventories. To fill the need for reliable and theoretically-grounded measurement of the full range of moral concerns, we developed the Moral Foundations Questionnaire (MFQ) based on a theoretical model of five universally available (but variably developed) sets of moral intuitions: Harm/care, Fairness/reciprocity, Ingroup/loyalty, Authority/respect, and Purity/sanctity. We present evidence for the internal and external validity of the scale and the model, and in doing so present new findings about morality: 1. Comparative model fitting of confirmatory factor analyses provides empirical justification for a five-factor structure of moral concerns. 2. Convergent/discriminant validity evidence suggests that moral concerns predict personality features and social group attitudes not previously considered morally relevant. 3. We establish pragmatic validity of the measure in providing new knowledge and research opportunities concerning demographic and cultural differences in moral intuitions. These analyses provide evidence for the usefulness of Moral Foundations Theory in simultaneously increasing the scope and sharpening the resolution of psychological views of morality. PMID:21244182

  10. Research in computer forensics

    OpenAIRE

    Wai, Hor Cheong

    2002-01-01

    Approved for public release; distribution is unlimited Computer Forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of E-commerce initiatives and the increasing criminal activities on the web, this area of study is catching on in the IT industry and among the law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

  11. AN INTELLIGENT CONVERSATION AGENT FOR HEALTH CARE DOMAIN

    Directory of Open Access Journals (Sweden)

    K. Karpagam

    2014-04-01

    Full Text Available Human Computer Interaction is one of the pervasive application areas of computer science, developing multimodal interaction for information sharing. The conversation agent acts as the core component for developing interfaces between a system and a user, with applied AI for proper responses. In this paper, the interactive system plays a vital role in improving knowledge in the domain of health through an intelligent interface between machine and human with text and speech. The primary aim is to enrich the user's knowledge and help the user in the domain of health, using a conversation agent to offer immediate responses with a human companion feel.

  12. Framing Effects: Dynamics and Task Domains

    Science.gov (United States)

    Wang

    1996-11-01

    The author examines the mechanisms and dynamics of framing effects in risky choices across three distinct task domains (i.e., life-death, public property, and personal money). The choice outcomes of the problems presented in each of the three task domains had a binary structure of a sure thing vs a gamble of equal expected value; the outcomes differed in their framing conditions and the expected values, ranging from 6000, 600, 60, to 6, numerically. It was hypothesized that subjects would become more risk seeking if the sure outcome was below their aspiration level (the minimum requirement). As predicted, more subjects preferred the gamble when facing the life-death choice problems than when facing the counterpart problems presented in the other two task domains. Subjects' risk preference varied categorically along the group size dimension in the life-death domain but changed more linearly over the expected value dimension in the monetary domain. Framing effects were observed in 7 of 13 pairs of problems, showing a positive frame-risk aversion and negative frame-risk seeking relationship. In addition, two types of framing effects were theoretically defined and empirically identified. A bidirectional framing effect involves a reversal in risk preference, and occurs when a decision maker's risk preference is ambiguous or weak. Four bidirectional effects were observed; in each case a majority of subjects preferred the sure outcome under a positive frame but the gamble under a negative frame. In contrast, a unidirectional framing effect refers to a preference shift due to the framing of choice outcomes: a majority of subjects preferred one choice outcome (either the sure thing or the gamble) under both framing conditions, with the positive frame augmenting the preference for the sure thing and the negative frame augmenting the preference for the gamble. These findings revealed some dynamic regularities of framing effects and posed implications for developing predictive and testable
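The equal-expected-value structure of the sure thing vs gamble pairs is easy to make concrete. A minimal sketch, with hypothetical probabilities and payoffs in the style of the study (not the actual stimuli):

```python
def expected_value(outcomes):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# A hypothetical problem pair in the style described: a sure gain of 600
# vs. a 50/50 gamble on 1200 or nothing -- equal expected values, as in
# the study's binary choice structure.
sure_thing = [(1.0, 600)]
gamble = [(0.5, 1200), (0.5, 0)]
```

Framing then varies only the description (e.g., "gain 600" vs "lose 600 of 1200"), while the expected values stay matched.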

  13. Topological domain walls in helimagnets

    Science.gov (United States)

    Schoenherr, P.; Müller, J.; Köhler, L.; Rosch, A.; Kanazawa, N.; Tokura, Y.; Garst, M.; Meier, D.

    2018-05-01

    Domain walls naturally arise whenever a symmetry is spontaneously broken. They interconnect regions with different realizations of the broken symmetry, promoting structure formation from cosmological length scales to the atomic level [1,2]. In ferroelectric and ferromagnetic materials, domain walls with unique functionalities emerge, holding great promise for nanoelectronics and spintronics applications [3-5]. These walls are usually of Ising, Bloch or Néel type and separate homogeneously ordered domains. Here we demonstrate that a wide variety of new domain walls occurs in the presence of spatially modulated domain states. Using magnetic force microscopy and micromagnetic simulations, we show three fundamental classes of domain walls to arise in the near-room-temperature helimagnet iron germanium. In contrast to conventional ferroics, the domain walls exhibit a well-defined inner structure, which, analogous to cholesteric liquid crystals, consists of topological disclination and dislocation defects. Similar to the magnetic skyrmions that form in the same material [6,7], the domain walls can carry a finite topological charge, permitting an efficient coupling to spin currents and contributions to a topological Hall effect. Our study establishes a new family of magnetic nano-objects with non-trivial topology, opening the door to innovative device concepts based on helimagnetic domain walls.

  14. Convectons in periodic and bounded domains

    International Nuclear Information System (INIS)

    Mercader, Isabel; Batiste, Oriol; Alonso, Arantxa; Knobloch, Edgar

    2010-01-01

    Numerical continuation is used to compute spatially localized convection in a binary fluid with no-slip laterally insulating boundary conditions and the results are compared with the corresponding ones for periodic boundary conditions (PBC). The change in the boundary conditions produces a dramatic change in the snaking bifurcation diagram that describes the organization of localized states with PBC: the snaking branches turn continuously into a large amplitude state that resembles periodic convection with defects at the sidewalls. Odd parity convectons are more affected by the boundary conditions since the sidewalls suppress the horizontal pumping action that accompanies these states in spatially periodic domains.

  15. Convectons in periodic and bounded domains

    Energy Technology Data Exchange (ETDEWEB)

    Mercader, Isabel; Batiste, Oriol; Alonso, Arantxa [Departament de Fisica Aplicada, Universitat Politecnica de Catalunya, Barcelona (Spain); Knobloch, Edgar [Department of Physics, University of California, Berkeley, CA 94720 (United States)

    2010-04-15

    Numerical continuation is used to compute spatially localized convection in a binary fluid with no-slip laterally insulating boundary conditions and the results are compared with the corresponding ones for periodic boundary conditions (PBC). The change in the boundary conditions produces a dramatic change in the snaking bifurcation diagram that describes the organization of localized states with PBC: the snaking branches turn continuously into a large amplitude state that resembles periodic convection with defects at the sidewalls. Odd parity convectons are more affected by the boundary conditions since the sidewalls suppress the horizontal pumping action that accompanies these states in spatially periodic domains.

  16. The BRCT domain is a phospho-protein binding domain.

    Science.gov (United States)

    Yu, Xiaochun; Chini, Claudia Christiano Silva; He, Miao; Mer, Georges; Chen, Junjie

    2003-10-24

    The carboxyl-terminal domain (BRCT) of the Breast Cancer Gene 1 (BRCA1) protein is an evolutionarily conserved module that exists in a large number of proteins from prokaryotes to eukaryotes. Although most BRCT domain-containing proteins participate in DNA-damage checkpoint or DNA-repair pathways, or both, the function of the BRCT domain is not fully understood. We show that the BRCA1 BRCT domain directly interacts with phosphorylated BRCA1-Associated Carboxyl-terminal Helicase (BACH1). This specific interaction between BRCA1 and phosphorylated BACH1 is cell cycle regulated and is required for DNA damage-induced checkpoint control during the transition from G2 to M phase of the cell cycle. Further, we show that two other BRCT domains interact with their respective physiological partners in a phosphorylation-dependent manner. Thirteen additional BRCT domains also preferentially bind phospho-peptides rather than nonphosphorylated control peptides. These data imply that the BRCT domain is a phospho-protein binding domain involved in cell cycle control.

  17. Associations of Total and Domain-Specific Sedentary Time With Type 2 Diabetes in Taiwanese Older Adults

    Directory of Open Access Journals (Sweden)

    Ming-Chun Hsueh

    2016-07-01

    Full Text Available Background: The increasing prevalence of type 2 diabetes in older adults has become a public health concern. We investigated the associations of total and domain-specific sedentary time with risk of type 2 diabetes in older adults. Methods: The sample comprised 1046 older people (aged ≥65 years). Analyses were performed using cross-sectional data collected via computer-assisted telephone-based interviews in 2014. Data on six self-reported domains of sedentary time (Measure of Older Adults' Sedentary Time), type 2 diabetes status, and sociodemographic variables were included in the study. Binary logistic regression analysis was performed to calculate the adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for total and individual sedentary behavior components and likelihood of type 2 diabetes. Results: A total of 17.5% of the participants reported type 2 diabetes. No significant associations were found between total sitting time and risk of type 2 diabetes after controlling for confounding factors. After total sedentary behavior was stratified into six domains, only watching television for more than 2 hours per day was associated with higher odds of type 2 diabetes (OR 1.56; 95% CI, 1.10–2.21), but no significant associations were found between the other domains of sedentary behavior (computer use, reading, socializing, transport, and hobbies) and risk of type 2 diabetes. Conclusions: These findings suggest that, among domain-specific sedentary behaviors, excessive television viewing might increase the risk of type 2 diabetes among older adults more than other forms of sedentary behavior.
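The adjusted odds ratios in the study come from a multivariable logistic regression; the unadjusted version of the same quantity can be sketched from a 2×2 table. The counts below are invented for illustration, not the study's data:

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a 95% Wald CI.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)                              # cross-product ratio
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = or_ * math.exp(-z * se_log_or)
    hi = or_ * math.exp(z * se_log_or)
    return or_, lo, hi

# e.g. 20/80 cases among the exposed vs 10/90 among the unexposed:
# odds_ratio_wald_ci(20, 80, 10, 90) -> OR = 2.25 with its Wald CI
```

Adjustment for confounders, as in the paper, requires fitting the full regression model rather than this single-table calculation.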

  18. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on sharing experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes, and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and make it easier for new members of the community to find the most promising collaboration partners. This talk presents the methodology and initial findings of a survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium: a collaboratively maintained shared resource of the environmental computing community.

  19. Models for randomly distributed nanoscopic domains on spherical vesicles

    Science.gov (United States)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size, a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescence techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from the commonly studied phase-separated vesicles used in a wide range of biophysical studies.
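A minimal Monte Carlo sketch of the kind of configuration the paper treats analytically: place monodisperse circular domains (spherical caps) at random on a unit sphere and reject configurations whose caps overlap. The function names and the simple rejection scheme are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n):
    """n points drawn uniformly on the unit sphere (domain centres)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def sample_nonoverlapping_caps(n_domains, cap_angle, max_tries=10_000):
    """Rejection-sample domain centres so that no two caps of angular
    radius cap_angle overlap (centre separation >= 2 * cap_angle)."""
    for _ in range(max_tries):
        centres = random_unit_vectors(n_domains)
        # pairwise angular separations from the dot products of the centres
        cos_sep = np.clip(centres @ centres.T, -1.0, 1.0)
        sep = np.arccos(cos_sep)
        i, j = np.triu_indices(n_domains, k=1)
        if np.all(sep[i, j] >= 2 * cap_angle):
            return centres
    raise RuntimeError("no non-overlapping configuration found")
```

Accumulating pair-separation histograms over many such samples would give the interdomain pair distribution the paper compares against its Ornstein-Zernike/Percus-Yevick analogs.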

  20. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...]

  1. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  2. Online self-report questionnaire on computer work-related exposure (OSCWE): validity and internal consistency.

    Science.gov (United States)

    Mekhora, Keerin; Jalayondeja, Wattana; Jalayondeja, Chutima; Bhuanantanondh, Petcharatana; Dusadiisariyavong, Asadang; Upiriyasakul, Rujiret; Anuraktam, Khajornyod

    2014-07-01

    To develop an online, self-report questionnaire on computer work-related exposure (OSCWE) and to determine the internal consistency and the face and content validity of the questionnaire. The online, self-report questionnaire was developed to determine the risk factors related to musculoskeletal disorders in computer users. It comprised five domains: personal, work-related, work environment, physical health and psychosocial factors. The questionnaire's content was validated by an occupational medical doctor and three physical therapy lecturers involved in ergonomic teaching. Twenty-five lay people examined the feasibility of computer administration and the user-friendliness of the language. The item correlation in each domain was analyzed for internal consistency (Cronbach's alpha). The content of the questionnaire was considered congruent with the testing purposes. Eight hundred and thirty-five computer users at the PTT Exploration and Production Public Company Limited registered for the online self-report questionnaire. The internal consistency of the five domains was: personal (alpha = 0.58), work-related (alpha = 0.348), work environment (alpha = 0.72), physical health (alpha = 0.68) and psychosocial factors (alpha = 0.93). The findings suggested that the OSCWE had acceptable internal consistency for the work environment and psychosocial factor domains. The OSCWE is available for use in population-based survey research among computer office workers.
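The Cronbach's alpha reported per domain above can be computed directly from a respondents-by-items score matrix. A small sketch (the variable names are illustrative):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Perfectly consistent items (every respondent answers all items identically) yield alpha = 1.0; values near the study's 0.348 for the work-related domain indicate weakly correlated items.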

  3. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small molecular compounds with granted regulatory approval, is enabling new developments in computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we describe recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  4. Resource Unavailability (RU) Per Domain Behavior

    NARCIS (Netherlands)

    Karagiannis, Georgios; Westberg, L.; Bader, A.; Tschofenig, Hannes; Tschofenig, H.

    2006-01-01

    This draft specifies a Per Domain Behavior that gives Diffserv nodes located outside the Diffserv domain(s), e.g., a receiver or another Diffserv-enabled router, the ability to detect when the resources provided by the Diffserv domain(s) are not available. The unavailability of resources in the domain

  5. Architecture for an advanced biomedical collaboration domain for the European paediatric cancer research community (ABCD-4-E).

    Science.gov (United States)

    Nitzlnader, Michael; Falgenhauer, Markus; Gossy, Christian; Schreier, Günter

    2015-01-01

    Today, progress in biomedical research often depends on large, interdisciplinary research projects and tailored information and communication technology (ICT) support. In the context of the European Network for Cancer Research in Children and Adolescents (ENCCA) project the exchange of data between data source (Source Domain) and data consumer (Consumer Domain) systems in a distributed computing environment needs to be facilitated. This work presents the requirements and the corresponding solution architecture of the Advanced Biomedical Collaboration Domain for Europe (ABCD-4-E). The proposed concept utilises public as well as private cloud systems, the Integrating the Healthcare Enterprise (IHE) framework and web-based applications to provide the core capabilities in accordance with privacy and security needs. The utility of crucial parts of the concept was evaluated by prototypic implementation. A discussion of the design indicates that the requirements of ENCCA are fully met. A whole system demonstration is currently being prepared to verify that ABCD-4-E has the potential to evolve into a domain-bridging collaboration platform in the future.

  6. Domain decomposition and multilevel integration for fermions

    International Nuclear Information System (INIS)

    Ce, Marco; Giusti, Leonardo; Schaefer, Stefan

    2016-01-01

    The numerical computation of many hadronic correlation functions is exceedingly difficult due to the exponentially decreasing signal-to-noise ratio with the distance between source and sink. Multilevel integration methods, using independent updates of separate regions in space-time, are known to be able to solve such problems but have so far been available only for pure gauge theory. We present first steps into the direction of making such integration schemes amenable to theories with fermions, by factorizing a given observable via an approximated domain decomposition of the quark propagator. This allows for multilevel integration of the (large) factorized contribution to the observable, while its (small) correction can be computed in the standard way.

  7. Taxonomies of Educational Objective Domain

    OpenAIRE

    Eman Ghanem Nayef; Nik Rosila Nik Yaacob; Hairul Nizam Ismail

    2013-01-01

    This paper highlights an effort to study the educational objective domain taxonomies, including Bloom's taxonomy, Lorin Anderson's taxonomy, and Wilson's taxonomy. In this study a comparison among these three taxonomies has been made. Results show that Bloom's taxonomy is more suitable as an analysis tool for the educational objective domain.

  8. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of process or plant detail, i.e., 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and

  9. Texture of lipid bilayer domains

    DEFF Research Database (Denmark)

    Jensen, Uffe Bernchou; Brewer, Jonathan R.; Midtiby, Henrik Skov

    2009-01-01

    We investigate the texture of gel (g) domains in binary lipid membranes composed of the phospholipids DPPC and DOPC. Lateral organization of lipid bilayer membranes is a topic of fundamental and biological importance. Whereas questions related to the size and composition of fluid membrane domains are well studied, the possibility of texture in gel domains has so far not been examined. When using polarized light for two-photon excitation of the fluorescent lipid probe Laurdan, the emission intensity is highly sensitive to the angle between the polarization and the tilt orientation of the lipid acyl chains. By imaging the intensity variations as a function of the polarization angle, we map the lateral variations of the lipid tilt within domains. Results reveal that gel domains are composed of subdomains with different lipid tilt directions. We have applied a Fourier decomposition method...

  10. Polar Domain Discovery with Sparkler

    Science.gov (United States)

    Duerr, R.; Khalsa, S. J. S.; Mattmann, C. A.; Ottilingam, N. K.; Singh, K.; Lopez, L. A.

    2017-12-01

    The scientific web is vast and ever growing. It encompasses millions of textual, scientific and multimedia documents describing research in a multitude of scientific streams. Most of these documents are hidden behind forms which require user action to retrieve, and thus cannot be directly accessed by content crawlers. These documents are hosted on web servers across the world, most often on outdated hardware and network infrastructure. Hence it is difficult and time-consuming to aggregate documents from the scientific web, especially those relevant to a specific domain, and generating meaningful domain-specific insights is currently difficult. We present an automated discovery system (Figure 1) using Sparkler, an open-source, extensible, horizontally scalable crawler which facilitates high-throughput, focused crawling of documents pertinent to a particular domain, such as information about polar regions. With this set of highly domain-relevant documents, we show that it is possible to answer analytical questions about that domain. Our domain discovery algorithm leverages prior domain knowledge to reach out to commercial/scientific search engines to generate seed URLs. Subject matter experts then annotate these seed URLs manually on a scale from highly relevant to irrelevant. We leverage this annotated dataset to train a machine learning model which predicts the domain relevance of a given document. We extend Sparkler with this model to focus crawling on documents relevant to that domain. Sparkler avoids disruption of service by 1) partitioning URLs by hostname, such that every node gets a different host to crawl, and by 2) inserting delays between subsequent requests. With Wrangler, an NSF-funded supercomputer, we scaled our domain discovery pipeline to crawl about 200k polar-specific documents from the scientific web within a day.
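The host-partitioning idea can be sketched as follows; the `hash_host` function and bucket layout are illustrative assumptions, not Sparkler's actual implementation:

```python
from collections import defaultdict
from urllib.parse import urlparse

def hash_host(host):
    # Stable toy hash (Python's built-in hash() is salted between runs).
    return sum(ord(ch) for ch in host)

def partition_by_host(urls, n_nodes):
    """Assign every URL of a given hostname to the same crawler node,
    so per-host politeness delays can be enforced locally on that node."""
    buckets = defaultdict(list)
    for url in urls:
        host = urlparse(url).hostname or ""
        buckets[hash_host(host) % n_nodes].append(url)
    return dict(buckets)
```

Because all URLs sharing a hostname land on one node, that node alone can insert the inter-request delays needed to avoid overloading the origin server.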

  11. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

In this publication, the author presents the requirements that user-specific software should fulfill to achieve an effective rationalisation of practice through computer use, together with the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during the system presentation. On the one hand there still exists a prejudice against programmes of standard texts, and on the other hand there are undefined fears that handling a computer is too difficult and that one first has to learn a programming language to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer use. (orig.) [de]

  12. The Human-Computer Domain Relation in UX Models

    DEFF Research Database (Denmark)

    Clemmensen, Torkil

influential lines of UX research: aesthetics and temporal UX, and two use situations: using a website and starting to use a smartphone. The results suggest that the two lines of UX research share a focus on users’ evaluative judgments of technology, and both focus on product qualities rather than activity...

  13. Wake force computation in the time domain for long structures

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-07-01

One is often interested in calculating the wake potentials for short bunches in long structures using TBCI. For ultra-relativistic particles it is sufficient to solve for the fields only over a window containing the bunch and moving along with it. This technique reduces both the memory and the running time required by a factor that equals the ratio of the structure length to the window length. For example, for a bunch with σ_z of one picosecond traversing a single SLAC cell this improvement factor is 15. It is thus possible to solve for the wakefields in very long structures: for a given problem, increasing the structure length will not change the memory required while only adding linearly to the CPU time needed.
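The quoted saving follows directly from the ratio of structure length to window length. A toy calculation (the lengths and mesh spacing here are hypothetical, chosen purely to reproduce the factor of 15 from the abstract, not taken from the SLAC cell geometry):

```python
def grid_cells(length, dz):
    """Number of longitudinal mesh cells needed to cover `length`
    at mesh spacing `dz`."""
    return round(length / dz)

structure_len, window_len, dz = 1.5, 0.1, 0.001  # hypothetical, in metres
full_cells = grid_cells(structure_len, dz)    # cells to hold the whole structure
window_cells = grid_cells(window_len, dz)     # cells with the moving window
improvement = full_cells / window_cells       # = structure/window length ratio
```

Memory stays fixed at `window_cells` however long the structure grows; only the number of window positions, and hence CPU time, rises linearly with structure length.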

  14. Multi-scale and multi-domain computational astrophysics.

    Science.gov (United States)

    van Elteren, Arjen; Pelupessy, Inti; Zwart, Simon Portegies

    2014-08-06

    Astronomical phenomena are governed by processes on all spatial and temporal scales, ranging from days to the age of the Universe (13.8 Gyr) as well as from kilometre size up to the size of the Universe. This enormous range in scales is contrived, but as long as there is a physical connection between the smallest and largest scales it is important to be able to resolve them all, and for the study of many astronomical phenomena this governance is present. Although covering all these scales is a challenge for numerical modellers, the most challenging aspect is the equally broad and complex range in physics, and the way in which these processes propagate through all scales. In our recent effort to cover all scales and all relevant physical processes on these scales, we have designed the Astrophysics Multipurpose Software Environment (AMUSE). AMUSE is a Python-based framework with production quality community codes and provides a specialized environment to connect this plethora of solvers to a homogeneous problem-solving environment. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  15. Domain shape instabilities and dendrite domain growth in uniaxial ferroelectrics

    Science.gov (United States)

    Shur, Vladimir Ya.; Akhmatkhanov, Andrey R.

    2018-01-01

    The effects of domain wall shape instabilities and the formation of nanodomains in front of moving walls obtained in various uniaxial ferroelectrics are discussed. Special attention is paid to the formation of self-assembled nanoscale and dendrite domain structures under highly non-equilibrium switching conditions. All obtained results are considered in the framework of the unified kinetic approach to domain structure evolution based on the analogy with first-order phase transformation. This article is part of the theme issue `From atomistic interfaces to dendritic patterns'.

  16. Separated matter and antimatter domains with vanishing domain walls

    Energy Technology Data Exchange (ETDEWEB)

    Dolgov, A.D.; Godunov, S.I.; Rudenko, A.S.; Tkachev, I.I., E-mail: dolgov@fe.infn.it, E-mail: sgodunov@itep.ru, E-mail: a.s.rudenko@inp.nsk.su, E-mail: tkachev@ms2.inr.ac.ru [Physics Department and Laboratory of Cosmology and Elementary Particle Physics, Novosibirsk State University, Pirogova st. 2, Novosibirsk, 630090 (Russian Federation)

    2015-10-01

We present a model of spontaneous (or dynamical) C and CP violation in which it is possible to generate domains of matter and antimatter separated by cosmologically large distances. Such C(CP) violation existed only in the early universe and later disappeared, leaving the generated baryonic and/or antibaryonic domains as its only trace; the domain wall problem therefore does not arise in this model. These features are achieved through a postulated form of interaction between the inflaton and a new scalar field, realizing short-time C(CP) violation.

  17. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud Computing is an Internet based computing, whereby shared resources, software and information, are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as a business model for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  18. Three-dimensional transient electromagnetic modeling in the Laplace Domain

    International Nuclear Information System (INIS)

    Mizunaga, H.; Lee, Ki Ha; Kim, H.J.

    1998-01-01

In modeling electromagnetic responses, Maxwell's equations in the frequency domain are popular and have been widely used (Nabighian, 1994; Newman and Alumbaugh, 1995; Smith, 1996, to name a few). Recently, electromagnetic modeling in the time domain using the finite-difference time-domain (FDTD) method (Wang and Hohmann, 1993) has also been used to study transient electromagnetic interactions in conductive media. This paper presents a new technique to compute the electromagnetic response of three-dimensional (3-D) structures. The proposed method is based on transforming Maxwell's equations to the Laplace domain. For each discrete Laplace variable, Maxwell's equations are discretized in 3-D using the staggered grid and the finite difference method (FDM). The resulting system of equations is then solved for the fields using the incomplete Cholesky conjugate gradient (ICCG) method. The new method is particularly effective in saving computer memory, since all the operations are carried out in real numbers; for the same reason, the computing speed is faster than frequency-domain modeling. The proposed approach can be an extremely useful tool in developing an inversion algorithm using time-domain data.
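To illustrate the "all operations in real numbers" point, here is a minimal unpreconditioned conjugate-gradient solver in pure Python. The paper's ICCG adds an incomplete-Cholesky preconditioner, which is omitted here; this sketch only shows that the whole iteration stays in real arithmetic for a symmetric positive-definite system:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Plain conjugate gradients for a dense SPD system A x = b,
    using only real-valued arithmetic throughout."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x, with x = 0
    p = r[:]                      # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x
```

In the Laplace-domain formulation, one such real-valued solve is performed per discrete Laplace variable.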

  19. Improving developer productivity with C++ embedded domain specific languages

    Science.gov (United States)

    Kozacik, Stephen; Chao, Evenie; Paolini, Aaron; Bonnett, James; Kelmelis, Eric

    2017-05-01

    Domain-specific languages are a useful tool for productivity allowing domain experts to program using familiar concepts and vocabulary while benefiting from performance choices made by computing experts. Embedding the domain specific language into an existing language allows easy interoperability with non-domain-specific code and use of standard compilers and build systems. In C++, this is enabled through the template and preprocessor features. C++ embedded domain specific languages (EDSLs) allow the user to write simple, safe, performant, domain specific code that has access to all the low-level functionality that C and C++ offer as well as the diverse set of libraries available in the C/C++ ecosystem. In this paper, we will discuss several tools available for building EDSLs in C++ and show examples of projects successfully leveraging EDSLs. Modern C++ has added many useful new features to the language which we have leveraged to further extend the capability of EDSLs. At EM Photonics, we have used EDSLs to allow developers to transparently benefit from using high performance computing (HPC) hardware. We will show ways EDSLs combine with existing technologies and EM Photonics high performance tools and libraries to produce clean, short, high performance code in ways that were not previously possible.
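C++ EDSLs typically rely on templates and operator overloading so that domain code reads like ordinary arithmetic while actually building an expression the library can analyze and optimize. The same deferred-evaluation idea can be sketched in Python with operator overloading — a loose analogue of the C++ mechanism, not the technique from the paper itself:

```python
class Expr:
    """Tiny embedded DSL: overloaded operators build an expression
    tree instead of computing immediately, loosely mirroring how
    C++ expression templates defer work to an optimizing backend."""
    def __add__(self, other): return BinOp("+", self, other)
    def __mul__(self, other): return BinOp("*", self, other)

class Var(Expr):
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]

class BinOp(Expr):
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right
    def eval(self, env):
        a, b = self.left.eval(env), self.right.eval(env)
        return a + b if self.op == "+" else a * b

x, y = Var("x"), Var("y")
expr = x * y + x          # domain code reads like ordinary arithmetic
```

In a real C++ EDSL the tree would be a template type resolved at compile time, letting the backend fuse operations or offload them to HPC hardware with no runtime dispatch overhead.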

  20. Helix Nebula: Enabling federation of existing data infrastructures and data services to an overarching cross-domain e-infrastructure

    Science.gov (United States)

    Lengert, Wolfgang; Farres, Jordi; Lanari, Riccardo; Casu, Francesco; Manunta, Michele; Lassalle-Balier, Gerard

    2014-05-01

Helix Nebula has established a growing public private partnership of more than 30 commercial cloud providers, SMEs, and publicly funded research organisations and e-infrastructures. The Helix Nebula strategy is to establish a federated cloud service across Europe. Three high-profile flagships, sponsored by CERN (high energy physics), EMBL (life sciences) and ESA/DLR/CNES/CNR (earth science), have been deployed and extensively tested within this federated environment. The commitments behind these initial flagships have created a critical mass that attracts suppliers and users to the initiative, to work together towards an "Information as a Service" market place. Significant progress in implementing the following 4 programmatic goals (as outlined in the strategic plan, Ref. 1) has been achieved: • Goal #1: Establish a cloud computing infrastructure for the European Research Area (ERA) serving as a platform for innovation and evolution of the overall infrastructure. • Goal #2: Identify and adopt suitable policies for trust, security and privacy on a European level that can be provided by the European cloud computing framework and infrastructure. • Goal #3: Create a light-weight governance structure for the future European cloud computing infrastructure that involves all the stakeholders and can evolve over time as the infrastructure, services and user base grow. • Goal #4: Define a funding scheme involving the three stakeholder groups (service suppliers; users; EC and national funding agencies) in a public-private partnership model to implement a cloud computing infrastructure that delivers a sustainable business environment adhering to European-level policies. Now in 2014 a first version of this generic cross-domain e-infrastructure is ready to go into operations, building on a federation of European industry and contributors (data, tools, knowledge, ...). This presentation describes how Helix Nebula is being used in the domain of earth science, focusing on geohazards. The

  1. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  2. Public lighting.

    NARCIS (Netherlands)

    Schreuder, D.A.

    1986-01-01

The function of public lighting and the relationship between public lighting and accidents are considered briefly as aspects of effective countermeasures. Research needs and recent developments in installation and operation are described. Public lighting is an efficient accident countermeasure, but

  3. N-Terminal Domains in Two-Domain Proteins Are Biased to Be Shorter and Predicted to Fold Faster Than Their C-Terminal Counterparts

    Directory of Open Access Journals (Sweden)

    Etai Jacob

    2013-04-01

Computational analysis of proteomes in all kingdoms of life reveals a strong tendency for N-terminal domains in two-domain proteins to have shorter sequences than their neighboring C-terminal domains. Given that folding rates are affected by chain length, we asked whether the tendency for N-terminal domains to be shorter than their neighboring C-terminal domains reflects selection for faster-folding N-terminal domains. Calculations of absolute contact order, another predictor of folding rate, provide additional evidence that N-terminal domains tend to fold faster than their neighboring C-terminal domains. A possible explanation for this bias, which is more pronounced in prokaryotes than in eukaryotes, is that faster folding of N-terminal domains reduces the risk for protein aggregation during folding by preventing formation of nonnative interdomain interactions. This explanation is supported by our finding that two-domain proteins with a shorter N-terminal domain are much more abundant than those with a shorter C-terminal domain.
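The length comparison underlying the reported bias can be expressed as a simple count over (N-terminal, C-terminal) length pairs. This is a toy sketch of the statistic, not the authors' actual pipeline:

```python
def n_terminal_bias(domain_pairs):
    """Fraction of two-domain proteins whose N-terminal domain is
    the shorter of the pair, among proteins where the two domain
    lengths differ (ties are ignored)."""
    shorter_n = sum(1 for n_len, c_len in domain_pairs if n_len < c_len)
    shorter_c = sum(1 for n_len, c_len in domain_pairs if n_len > c_len)
    decided = shorter_n + shorter_c
    return shorter_n / decided if decided else float("nan")
```

A value well above 0.5 over a proteome would correspond to the N-terminal bias the abstract describes.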

  4. Ferroelectric negative capacitance domain dynamics

    Science.gov (United States)

    Hoffmann, Michael; Khan, Asif Islam; Serrao, Claudy; Lu, Zhongyuan; Salahuddin, Sayeef; Pešić, Milan; Slesazeck, Stefan; Schroeder, Uwe; Mikolajick, Thomas

    2018-05-01

    Transient negative capacitance effects in epitaxial ferroelectric Pb(Zr0.2Ti0.8)O3 capacitors are investigated with a focus on the dynamical switching behavior governed by domain nucleation and growth. Voltage pulses are applied to a series connection of the ferroelectric capacitor and a resistor to directly measure the ferroelectric negative capacitance during switching. A time-dependent Ginzburg-Landau approach is used to investigate the underlying domain dynamics. The transient negative capacitance is shown to originate from reverse domain nucleation and unrestricted domain growth. However, with the onset of domain coalescence, the capacitance becomes positive again. The persistence of the negative capacitance state is therefore limited by the speed of domain wall motion. By changing the applied electric field, capacitor area or external resistance, this domain wall velocity can be varied predictably over several orders of magnitude. Additionally, detailed insights into the intrinsic material properties of the ferroelectric are obtainable through these measurements. A new method for reliable extraction of the average negative capacitance of the ferroelectric is presented. Furthermore, a simple analytical model is developed, which accurately describes the negative capacitance transient time as a function of the material properties and the experimental boundary conditions.
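A zero-dimensional caricature of the time-dependent Ginzburg-Landau approach mentioned above: relaxation of a single polarization value in a double-well Landau potential under an applied field. The coefficients are dimensionless and illustrative (not fitted to Pb(Zr0.2Ti0.8)O3), and this sketch omits the spatial domain nucleation, growth and coalescence that the actual analysis resolves:

```python
def relax_polarization(E, P0, alpha=-1.0, beta=1.0, gamma=1.0,
                       dt=0.01, steps=5000):
    """Forward-Euler integration of the single-domain TDGL equation
    dP/dt = -gamma * (alpha*P + beta*P**3 - E), i.e. overdamped
    relaxation in the Landau free energy F = a/2 P^2 + b/4 P^4 - E P."""
    P = P0
    for _ in range(steps):
        P -= dt * gamma * (alpha * P + beta * P ** 3 - E)
    return P
```

With these coefficients a field above the coercive value removes the barrier, so a negatively poled state (P0 = -1) relaxes into the field-favored positive well, the 0-D analogue of switching.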

  5. PREFACE: Domain wall dynamics in nanostructures Domain wall dynamics in nanostructures

    Science.gov (United States)

    Marrows, C. H.; Meier, G.

    2012-01-01

    Domain structures in magnetic materials are ubiquitous and have been studied for decades. The walls that separate them are topological defects in the magnetic order parameter and have a wide variety of complex forms. In general, their investigation is difficult in bulk materials since only the domain structure on the surface of a specimen is visible. Cutting the sample to reveal the interior causes a rearrangement of the domains into a new form. As with many other areas of magnetism, the study of domain wall physics has been revitalised by the advent of nanotechnology. The ability to fabricate nanoscale structures has permitted the formation of simplified and controlled domain patterns; the development of advanced microscopy methods has permitted them to be imaged and then modelled; subjecting them to ultrashort field and current pulses has permitted their dynamics to be explored. The latest results from all of these advances are described in this special issue. Not only has this led to results of great scientific beauty, but also to concepts of great applicability to future information technologies. In this issue the reader will find the latest results for these domain wall dynamics and the high-speed processes of topological structures such as domain walls and magnetic vortices. These dynamics can be driven by the application of magnetic fields, or by flowing currents through spintronic devices using the novel physics of spin-transfer torque. This complexity has been studied using a wide variety of experimental techniques at the edge of the spatial and temporal resolution currently available, and can be described using sophisticated analytical theory and computational modelling. As a result, the dynamics can be engineered to give rise to finely controlled memory and logic devices with new functionality. Moreover, the field is moving to study not only the conventional transition metal ferromagnets, but also complex heterostructures, novel magnets and even other

  6. PENGATURAN PASSING OFF DALAM PENGGUNAAN DOMAIN NAME TERKAIT DENGAN MEREK

    Directory of Open Access Journals (Sweden)

    Herti Yunita Putri

    2016-09-01

In the cyber world we often hear the term 'domain name'. A domain name is a unique name that identifies a server computer, such as a web server or email server, on a computer network or the Internet. Passing off causes confusion through the use of the mark of a famous brand on goods and services. A domain name chosen on the Internet is often similar to a domain name belonging to another party, and such similar domain names are often used by irresponsible parties to take advantage of them for themselves; this can be caused by business competition on the Internet, and it is what is called passing off. This research is normative juridical research with a qualitative analysis. The legal materials include primary, secondary and tertiary legal materials, and the collection technique applied is literary study. The legal materials were analyzed with regard to the definitions of mark, domain name and passing off, the use of passing off in relation to marks and domain names, and the rules of Indonesian law concerning marks, domain names and passing off, in the hope that the analysis can serve as a basic reference and a source of legal considerations useful in Indonesian legal practice. There are two forms of passing off related to marks and domain names, called cybersquatting and typosquatting. Domain names are not clearly regulated in the trademark law, Act No. 15 of 2001; they are addressed in PP 24 of 1993 on the class list of goods and services in marks, where telecommunications is included among the goods and services. Domain names themselves are regulated in the UDRP (Uniform Dispute Resolution Policy), administered by the competent institution, ICANN (Internet Corporation for Assigned Names and Numbers).

  7. Evaluation of need for ontologies to manage domain content for the Reportable Conditions Knowledge Management System.

    Science.gov (United States)

    Eilbeck, Karen L; Lipstein, Julie; McGarvey, Sunanda; Staes, Catherine J

    2014-01-01

    The Reportable Condition Knowledge Management System (RCKMS) is envisioned to be a single, comprehensive, authoritative, real-time portal to author, view and access computable information about reportable conditions. The system is designed for use by hospitals, laboratories, health information exchanges, and providers to meet public health reporting requirements. The RCKMS Knowledge Representation Workgroup was tasked to explore the need for ontologies to support RCKMS functionality. The workgroup reviewed relevant projects and defined criteria to evaluate candidate knowledge domain areas for ontology development. The use of ontologies is justified for this project to unify the semantics used to describe similar reportable events and concepts between different jurisdictions and over time, to aid data integration, and to manage large, unwieldy datasets that evolve, and are sometimes externally managed.

  8. A pseudospectral collocation time-domain method for diffractive optics

    DEFF Research Database (Denmark)

    Dinesen, P.G.; Hesthaven, J.S.; Lynov, Jens-Peter

    2000-01-01

    We present a pseudospectral method for the analysis of diffractive optical elements. The method computes a direct time-domain solution of Maxwell's equations and is applied to solving wave propagation in 2D diffractive optical elements. (C) 2000 IMACS. Published by Elsevier Science B.V. All rights...

  9. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization and Utility Computing will be discussed and analyzed.

  10. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?", identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization and Utility Computing will be discussed and analyzed.

  11. Semiotics, Information Science, Documents and Computers.

    Science.gov (United States)

    Warner, Julian

    1990-01-01

    Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)

  12. Transcript structure and domain display: a customizable transcript visualization tool.

    Science.gov (United States)

    Watanabe, Kenneth A; Ma, Kaiwang; Homayouni, Arielle; Rushton, Paul J; Shen, Qingxi J

    2016-07-01

Transcript Structure and Domain Display (TSDD) is a publicly available, web-based program that provides publication-quality images of transcript structures and domains. TSDD is capable of producing transcript structures from GFF/GFF3 and BED files. Alternatively, the GFF files of several model organisms have been pre-loaded so that users only need to enter the locus IDs of the transcripts to be displayed. Visualization of transcripts provides many benefits to researchers, ranging from evolutionary analysis of DNA-binding domains to predictive function modeling. TSDD is freely available for non-commercial users at http://shenlab.sols.unlv.edu/shenlab/software/TSD/transcript_display.html. Contact: jeffery.shen@unlv.nevada.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. Domain decomposition methods for core calculations using the MINOS solver

    International Nuclear Information System (INIS)

    Guerin, P.; Baudron, A. M.; Lautard, J. J.

    2007-01-01

Cell by cell homogenized transport calculations of an entire nuclear reactor core are currently too expensive for industrial applications, even if a simplified transport (SPn) approximation is used. In order to take advantage of parallel computers, we propose here two domain decomposition methods using the mixed dual finite element solver MINOS. The first one is a modal synthesis method on overlapping sub-domains: several eigenmode solutions of a local problem on each sub-domain are taken as basis functions used for the resolution of the global problem on the whole domain. The second one is an iterative method based on non-overlapping domain decomposition with Robin interface conditions. At each iteration, we solve the problem on each sub-domain with the interface conditions given by the solutions on the neighboring sub-domains estimated at the previous iteration. For these two methods, we give numerical results which demonstrate their accuracy and their efficiency for the diffusion model on realistic 2D and 3D cores. (authors)
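The exchange-at-interfaces idea behind the second method can be sketched in one dimension. This toy uses the simpler overlapping Schwarz variant with Dirichlet interface data (the method above uses Robin conditions on non-overlapping sub-domains), applied to u'' = 0 on [0,1] with u(0)=0, u(1)=1, where each sub-domain solve is an exact linear interpolation:

```python
def schwarz_1d(iters=60):
    """Alternating Schwarz iteration for u'' = 0 on [0,1] with
    u(0) = 0, u(1) = 1, split into overlapping sub-domains
    A = [0, 0.6] and B = [0.4, 1].  Each sweep solves one
    sub-domain using interface values taken from the neighbour's
    previous solution; since u'' = 0, each local solve is just a
    linear interpolation between its two boundary values."""
    g_right = 0.0                 # initial guess for u at x = 0.6
    for _ in range(iters):
        # A: linear from (0, 0) to (0.6, g_right), evaluated at 0.4
        u_a_04 = (0.4 / 0.6) * g_right
        # B: linear from (0.4, u_a_04) to (1, 1), evaluated at 0.6
        g_right = u_a_04 + ((0.6 - 0.4) / (1.0 - 0.4)) * (1.0 - u_a_04)
    return u_a_04, g_right        # exact solution u(x) = x gives 0.4, 0.6
```

The interface values converge geometrically to the exact solution u(x) = x; the convergence rate depends on the overlap width, which is why interface conditions (Dirichlet vs. Robin) and overlap are the design knobs in such methods.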

  14. The domain theory: patterns for knowledge and software reuse

    National Research Council Canada - National Science Library

    Sutcliffe, Alistair

    2002-01-01


  15. Public Computer Usage in Chapel Hill Public Library

    Data.gov (United States)

Town of Chapel Hill, North Carolina — Data collected November 2014 - May 2016. As of June 2016, this data is no longer collected on a continual basis. This dataset includes frequency and length of use of...

  16. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held in Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems such as parameter settings for controlling processes in bioreactors and other processes, resource-constrained project scheduling, infection distribution, molecular distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, and localization of the abrupt atmospheric contamination source. It shows how to develop algorithms for them based on new metaheuristic methods such as evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  17. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications, held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book: (1) engineering design and simulation; (2) biomedical sciences; and (3) interactive & digital media. The book also addresses fundamental issues in GPU computing, with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book, and training professionals and educators can also use it to learn about possible applications of GPU technology in various areas.

  18. Cross-Sectional Associations between Home Environmental Factors and Domain-Specific Sedentary Behaviors in Adults: The Moderating Role of Socio-Demographic Variables and BMI

    Science.gov (United States)

    Busschaert, Cedric; Cardon, Greet; Chastin, Sebastien F. M.; Van Cauwenberg, Jelle; De Cocker, Katrien

    2017-01-01

Despite the negative health effects of too much sitting, the majority of adults are too sedentary. To develop effective interventions, insight is needed into the home environmental correlates of adults’ sedentary behaviors, and into the susceptibility of population subgroups to these home environmental cues. In total, 559 Flemish adults reported socio-demographics, weight and height, home environmental factors and domain-specific sedentary behaviors. Generalized linear modeling was conducted to examine the main associations between home environmental factors and domain-specific sedentary behaviors, and to test the moderating role of socio-demographics and BMI on these associations. In case of significant interactions, stratified analyses were performed. Results showed that, among those who did use a computer/laptop during the last week, a one-unit increase in the number of computers or laptops was associated with 17% (OR = 1.17; 95% CI = 1.02, 1.34) and 24% (OR = 1.24; 95% CI = 1.08, 1.43) more minutes of computer time per day, respectively. The association with the proximity of the remote controller (p […]) was moderated by BMI, with significant positive associations limited to those who were not overweight. To conclude, home environmental factors were associated with domain-specific sedentary behaviors, especially in healthy-weight adults. If confirmed by longitudinal studies, public health professionals should encourage adults to limit the number of indoor entertainment devices and motorized vehicles. PMID:29088089
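For reading the reported effect sizes: in a generalized linear model with a log or logit link, exponentiating a coefficient gives the multiplicative effect per one-unit increase of the predictor, which is how an OR of 1.17 reads as "17% more". A minimal sketch of that conversion (generic, not the study's actual model fit):

```python
import math

def ratio_effect(coef):
    """exp(coefficient) = multiplicative change in the outcome
    (odds, or rate, depending on the link) per one-unit increase
    of the predictor in a log-link GLM."""
    return math.exp(coef)

def percent_change(coef):
    """The same effect expressed as a percentage change."""
    return 100.0 * (ratio_effect(coef) - 1.0)
```

So a coefficient of 0 corresponds to a ratio of 1 (no effect), and a ratio of 1.17 corresponds to a 17% increase.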

  19. Essentials of Computational Electromagnetics

    CERN Document Server

    Sheng, Xin-Qing

    2012-01-01

Essentials of Computational Electromagnetics provides an in-depth introduction to the three main full-wave numerical methods in computational electromagnetics (CEM); namely, the method of moments (MoM), the finite element method (FEM), and the finite-difference time-domain (FDTD) method. Numerous monographs can be found addressing one of the above three methods. However, few give a broad general overview of the essentials embodied in these methods, or were published too early to include recent advances. Furthermore, many existing monographs only present the final numerical results without specifying

  20. CLOUD COMPUTING SECURITY ISSUES

    OpenAIRE

    Florin OGIGAU-NEAMTIU

    2012-01-01

The term “cloud computing” has been in the spotlight among IT specialists in recent years because of its potential to transform the industry. The promised benefits have led companies to invest large sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality ...

  1. Topology Based Domain Search (TBDS)

    National Research Council Canada - National Science Library

    Manning, William

    2002-01-01

    This effort will explore radical changes in the way Domain Name System (DNS) is used by endpoints in a network to improve the resilience of the endpoint and its applications in the face of dynamically changing infrastructure topology...

  2. Domain Discretization and Circle Packings

    DEFF Research Database (Denmark)

    Dias, Kealey

    A circle packing is a configuration of circles which are tangent with one another in a prescribed pattern determined by a combinatorial triangulation, where the configuration fills a planar domain or a two-dimensional surface. The vertices in the triangulation correspond to centers of circles...... to domain discretization problems such as triangulation and unstructured mesh generation techniques. We wish to ask ourselves the question: given a cloud of points in the plane (we restrict ourselves to planar domains), is it possible to construct a circle packing preserving the positions of the vertices...... and constrained meshes having predefined vertices as constraints. A standard method of two-dimensional mesh generation involves conformal mapping of the surface or domain to standardized shapes, such as a disk. Since circle packing is a new technique for constructing discrete conformal mappings, it is possible...

  3. Heliborne time domain electromagnetic system

    International Nuclear Information System (INIS)

    Bhattacharya, S.

    2009-01-01

    The Atomic Minerals Directorate (AMD) is using heliborne and ground time domain electromagnetic (TDEM) systems for the exploration of deep-seated unconformity-type uranium deposits. Uranium has been explored in various parts of the world, such as the Athabasca basin, using time domain electromagnetic systems. AMD has identified some areas in India where such deposits are available. Apart from uranium exploration, TDEM systems are used for the exploration of deep-seated minerals like diamonds. Bhabha Atomic Research Centre (BARC) is involved in the indigenous design of the heliborne time domain system, since this system is useful for DAE and also has a scope of wide application. In this paper we discuss the principle of time domain electromagnetic systems, their capabilities, and the development and problems of such systems for various other mineral exploration. (author)

  4. Anisotropy of domain wall resistance

    Science.gov (United States)

    Viret; Samson; Warin; Marty; Ott; Sondergard; Klein; Fermon

    2000-10-30

    The resistive effect of domain walls in FePd films with perpendicular anisotropy was studied experimentally as a function of field and temperature. The films were grown directly on MgO substrates, which induces an unusual virgin magnetic configuration composed of 60 nm wide parallel stripe domains. This allowed us to carry out the first measurements of the anisotropy of domain wall resistivity in the two configurations of current perpendicular and parallel to the walls. At 18 K, we find 8.2% and 1.3% for the domain wall magnetoresistance normalized to the wall width (8 nm) in these two respective configurations. These values are consistent with the predictions of Levy and Zhang.

  5. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  6. Time and frequency domain analyses of the Hualien Large-Scale Seismic Test

    International Nuclear Information System (INIS)

    Kabanda, John; Kwon, Oh-Sung; Kwon, Gunup

    2015-01-01

    Highlights: • Time- and frequency-domain analysis methods are verified against each other. • The two analysis methods are validated against Hualien LSST. • The nonlinear time domain (NLTD) analysis resulted in more realistic response. • The frequency domain (FD) analysis shows amplification at resonant frequencies. • The NLTD analysis requires significant modeling and computing time. - Abstract: In the nuclear industry, the equivalent-linear frequency domain analysis method has been the de facto standard procedure primarily due to the method's computational efficiency. This study explores the feasibility of applying the nonlinear time domain analysis method for the soil–structure-interaction analysis of nuclear power facilities. As a first step, the equivalency of the time and frequency domain analysis methods is verified through a site response analysis of one-dimensional soil, a dynamic impedance analysis of soil–foundation system, and a seismic response analysis of the entire soil–structure system. For the verifications, an idealized elastic soil–structure system is used to minimize variables in the comparison of the two methods. Then, the verified analysis methods are used to develop time and frequency domain models of Hualien Large-Scale Seismic Test. The predicted structural responses are compared against field measurements. The models are also analyzed with an amplified ground motion to evaluate discrepancies of the time and frequency domain analysis methods when the soil–structure system behaves beyond the elastic range. The analysis results show that the equivalent-linear frequency domain analysis method amplifies certain frequency bands and tends to result in higher structural acceleration than the nonlinear time domain analysis method. A comparison with field measurements shows that the nonlinear time domain analysis method better captures the frequency distribution of recorded structural responses than the frequency domain
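    As a side illustration of why time- and frequency-domain treatments of a linear system agree (a textbook property of linear time-invariant systems, not this paper's soil–structure models): circular convolution in the time domain equals pointwise multiplication of spectra in the frequency domain. A self-contained sketch using a naive DFT:

```python
import cmath

# Naive DFT/IDFT and two routes to the same circular convolution:
# directly in time, and via multiplication of spectra in frequency.
# (Illustrative sketch only; names are ours.)
def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def circular_convolution_time(x, h):
    N = len(x)
    return [sum(x[m] * h[(n - m) % N] for m in range(N)) for n in range(N)]

def circular_convolution_freq(x, h):
    X, H = dft(x), dft(h)
    return [y.real for y in idft([a * b for a, b in zip(X, H)])]

x = [1.0, 2.0, 0.0, -1.0]   # "excitation"
h = [0.5, 0.25, 0.0, 0.0]   # "impulse response"
t_result = circular_convolution_time(x, h)
f_result = circular_convolution_freq(x, h)
assert all(abs(a - b) < 1e-9 for a, b in zip(t_result, f_result))
print("time and frequency domains agree")
```

For a nonlinear system this equivalence breaks down, which is exactly why the paper compares equivalent-linear frequency-domain and nonlinear time-domain analyses.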

  7. Maneuver from the Air Domain

    Science.gov (United States)

    2016-05-26

    Overload … From the previous discussion, cognitive maneuver seeks to degrade the enemy’s capacity for … in all domains, the ability to maneuver from the air domain in the cognitive sense comes primarily from air power’s unique ability to overload the … cognitive maneuver mechanisms developed in the 1980s as part of broader maneuver warfare theory. The result is a proposed definition of maneuver from

  8. Ferroelectric Negative Capacitance Domain Dynamics

    OpenAIRE

    Hoffmann, Michael; Khan, Asif Islam; Serrao, Claudy; Lu, Zhongyuan; Salahuddin, Sayeef; Pešić, Milan; Slesazeck, Stefan; Schroeder, Uwe; Mikolajick, Thomas

    2017-01-01

    Transient negative capacitance effects in epitaxial ferroelectric Pb(Zr$_{0.2}$Ti$_{0.8}$)O$_3$ capacitors are investigated with a focus on the dynamical switching behavior governed by domain nucleation and growth. Voltage pulses are applied to a series connection of the ferroelectric capacitor and a resistor to directly measure the ferroelectric negative capacitance during switching. A time-dependent Ginzburg-Landau approach is used to investigate the underlying domain dynamics. The transien...

  9. Gravity and domain wall problem

    International Nuclear Information System (INIS)

    Rai, B.; Senjanovic, G.

    1992-11-01

    It is well known that the spontaneous breaking of discrete symmetries may lead to conflict with big-bang cosmology. This is due to the formation of domain walls, which give an unacceptable contribution to the energy density of the universe. On the other hand, it is expected that gravity breaks global symmetries explicitly. In this work we propose that this could provide a natural solution to the domain-wall problem. (author). 17 refs

  10. Incompleteness in the finite domain

    Czech Academy of Sciences Publication Activity Database

    Pudlák, Pavel

    2017-01-01

    Roč. 23, č. 4 (2017), s. 405-441 ISSN 1079-8986 EU Projects: European Commission(XE) 339691 - FEALORA Institutional support: RVO:67985840 Keywords : finite domain Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.742, year: 2016 https://www.cambridge.org/core/journals/bulletin-of-symbolic-logic/article/incompleteness-in-the-finite-domain/D239B1761A73DCA534A4805A76D81C76

  11. EH domain of EHD1

    Energy Technology Data Exchange (ETDEWEB)

    Kieken, Fabien; Jovic, Marko; Naslavsky, Naava; Caplan, Steve, E-mail: scaplan@unmc.edu; Sorgen, Paul L. [University of Nebraska Medical Center, Department of Biochemistry and Molecular Biology and Eppley Cancer Center (United States)], E-mail: psorgen@unmc.edu

    2007-12-15

    EHD1 is a member of the mammalian C-terminal Eps15 homology domain (EH) containing protein family, and regulates the recycling of various receptors from the endocytic recycling compartment to the plasma membrane. The EH domain of EHD1 binds to proteins containing either an Asn-Pro-Phe or Asp-Pro-Phe motif, and plays an important role in the subcellular localization and function of EHD1. Thus far, the structures of five N-terminal EH domains from other proteins have been solved, but to date, the structure of the EH domains from the four C-terminal EHD family paralogs remains unknown. In this study, we have assigned the 133 C-terminal residues of EHD1, which includes the EH domain, and solved its solution structure. While the overall structure resembles that of the second of the three N-terminal Eps15 EH domains, potentially significant differences in surface charge and the structure of the tripeptide-binding pocket are discussed.

  12. EH domain of EHD1

    International Nuclear Information System (INIS)

    Kieken, Fabien; Jovic, Marko; Naslavsky, Naava; Caplan, Steve; Sorgen, Paul L.

    2007-01-01

    EHD1 is a member of the mammalian C-terminal Eps15 homology domain (EH) containing protein family, and regulates the recycling of various receptors from the endocytic recycling compartment to the plasma membrane. The EH domain of EHD1 binds to proteins containing either an Asn-Pro-Phe or Asp-Pro-Phe motif, and plays an important role in the subcellular localization and function of EHD1. Thus far, the structures of five N-terminal EH domains from other proteins have been solved, but to date, the structure of the EH domains from the four C-terminal EHD family paralogs remains unknown. In this study, we have assigned the 133 C-terminal residues of EHD1, which includes the EH domain, and solved its solution structure. While the overall structure resembles that of the second of the three N-terminal Eps15 EH domains, potentially significant differences in surface charge and the structure of the tripeptide-binding pocket are discussed

  13. A hybrid time-domain discontinuous galerkin-boundary integral method for electromagnetic scattering analysis

    KAUST Repository

    Li, Ping; Shi, Yifei; Jiang, Lijun; Bagci, Hakan

    2014-01-01

    A scheme hybridizing discontinuous Galerkin time-domain (DGTD) and time-domain boundary integral (TDBI) methods for accurately analyzing transient electromagnetic scattering is proposed. Radiation condition is enforced using the numerical flux on the truncation boundary. The fields required by the flux are computed using the TDBI from equivalent currents introduced on a Huygens' surface enclosing the scatterer. The hybrid DGTDBI ensures that the radiation condition is mathematically exact and the resulting computation domain is as small as possible since the truncation boundary conforms to scatterer's shape and is located very close to its surface. Locally truncated domains can also be defined around each disconnected scatterer additionally reducing the size of the overall computation domain. Numerical examples demonstrating the accuracy and versatility of the proposed method are presented. © 2014 IEEE.

  14. A hybrid time-domain discontinuous galerkin-boundary integral method for electromagnetic scattering analysis

    KAUST Repository

    Li, Ping

    2014-05-01

    A scheme hybridizing discontinuous Galerkin time-domain (DGTD) and time-domain boundary integral (TDBI) methods for accurately analyzing transient electromagnetic scattering is proposed. Radiation condition is enforced using the numerical flux on the truncation boundary. The fields required by the flux are computed using the TDBI from equivalent currents introduced on a Huygens' surface enclosing the scatterer. The hybrid DGTDBI ensures that the radiation condition is mathematically exact and the resulting computation domain is as small as possible since the truncation boundary conforms to scatterer's shape and is located very close to its surface. Locally truncated domains can also be defined around each disconnected scatterer additionally reducing the size of the overall computation domain. Numerical examples demonstrating the accuracy and versatility of the proposed method are presented. © 2014 IEEE.

  15. Computer Security: Working privately in public

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2014-01-01

    Gosh, was he annoyed! I just came back from a long duty trip. Nine hours straight on the plane. As usual plenty of time to get some long awaited emails answered, time to write another document, and to prepare some presentations. The guy sitting next to me was probably thinking the same. So, from time to time I gazed over and looked at his screen following what he was working on. Curiosity is part of my job. Laptop screens are attractive. Discretion is part of my job, too. But given the confined space in the economy class of an Airbus, the screen was just shining at me and he was not able to move away or reposition his screen… He seemed to feel increasingly uncomfortable. Consequently, he gave up and read the newspaper instead. Obviously annoyed. He could have protected himself better...   Has this also happened to you? On the plane? On the train? In a restaurant? Or even in a conference or seminar? Do you care? If you do, what about clipping a “privacy screen” o...

  16. Public Websites and Human–computer Interaction

    DEFF Research Database (Denmark)

    Sørum, Hanne; Andersen, Kim Normann; Vatrapu, Ravi

    2012-01-01

    system use by representatives. A Pearson correlation analysis of user evaluation from 296 websites that participated in the Danish web award Bedst på Nettet (‘Top of the Web’) showed no significant positive correlation between website quality and user satisfaction. We put forward recommendations...... for further investigation: (1) inclusion of real users (citizens and businesses) in real-use setting in the evaluation process could help move forward the understanding of the relationship between website quality and end-user satisfaction; (2) the lack of correlation between website quality and user...

  17. Wavefield extrapolation in pseudodepth domain

    KAUST Repository

    Ma, Xuxin; Alkhalifah, Tariq Ali

    2013-01-01

    Wavefields are commonly computed in the Cartesian coordinate frame. Its efficiency is inherently limited due to spatial oversampling in deep layers, where the velocity is high and wavelengths are long. To alleviate this computational waste due

  18. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software.Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  19. Domain-to-domain coupling in voltage-sensing phosphatase.

    Science.gov (United States)

    Sakata, Souhei; Matsuda, Makoto; Kawanabe, Akira; Okamura, Yasushi

    2017-01-01

    Voltage-sensing phosphatase (VSP) consists of a transmembrane voltage sensor and a cytoplasmic enzyme region. The enzyme region contains the phosphatase and C2 domains, is structurally similar to the tumor suppressor phosphatase PTEN, and catalyzes the dephosphorylation of phosphoinositides. The transmembrane voltage sensor is connected to the phosphatase through a short linker region, and phosphatase activity is induced upon membrane depolarization. Although the detailed molecular characteristics of the voltage sensor domain and the enzyme region have been revealed, little is known about how these two regions are coupled. In addition, it is important to know whether the mechanism for coupling between the voltage sensor domain and downstream effector function is shared among other voltage sensor domain-containing proteins. Recent studies in which specific amino acid sites were genetically labeled using a fluorescent unnatural amino acid have enabled detection of the local structural changes in the cytoplasmic region of Ciona intestinalis VSP that occur with a change in membrane potential. The results of those studies provide novel insight into how the enzyme activity of the cytoplasmic region of VSP is regulated by the voltage sensor domain.

  20. Effective Domain Partitioning for Multi-Clock Domain IP Core Wrapper Design under Power Constraints

    Science.gov (United States)

    Yu, Thomas Edison; Yoneda, Tomokazu; Zhao, Danella; Fujiwara, Hideo

    The rapid advancement of VLSI technology has made it possible for chip designers and manufacturers to embed the components of a whole system onto a single chip, called System-on-Chip or SoC. SoCs make use of pre-designed modules, called IP-cores, which provide faster design time and quicker time-to-market. Furthermore, SoCs that operate at multiple clock domains and under very low power requirements are being utilized in the latest communications, networking and signal processing devices. As a result, the testing of SoCs and multi-clock domain embedded cores under power constraints has been rapidly gaining importance. In this research, a novel method for designing power-aware test wrappers for embedded cores with multiple clock domains is presented. By effectively partitioning the various clock domains, we are able to increase the solution space of possible test schedules for the core. Since previous methods were limited to concurrently testing all the clock domains, we effectively remove this limitation by making use of bandwidth conversion, multiple shift frequencies and properly gating the clock signals to control the shift activity of various core logic elements. The combination of the above techniques gives us greater flexibility when determining an optimal test schedule under very tight power constraints. Furthermore, since it is computationally intensive to search the entire expanded solution space for the possible test schedules, we propose a heuristic 3-D bin packing algorithm to determine the optimal wrapper architecture and test schedule while minimizing the test time under power and bandwidth constraints.
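    The packing idea behind such scheduling can be illustrated with a much simpler 1-D analogue (the paper's heuristic is a richer 3-D packing over time, bandwidth and power; the function below is our own simplification, not the authors' algorithm): tests, modeled only by a power cost, are grouped into concurrent "sessions" whose total power must stay within a budget, using first-fit decreasing.

```python
# First-fit-decreasing sketch of power-constrained grouping: place each
# test (largest power cost first) into the first session with room left,
# opening a new session when none fits.
def first_fit_decreasing(test_powers, power_budget):
    sessions = []  # each session is a list of per-test power costs
    for p in sorted(test_powers, reverse=True):
        for s in sessions:
            if sum(s) + p <= power_budget:
                s.append(p)
                break
        else:  # no existing session has room: open a new one
            sessions.append([p])
    return sessions

# Six tests under a budget of 10 power units pack into 3 sessions,
# matching the lower bound ceil(22 / 10) = 3.
print(len(first_fit_decreasing([6, 5, 4, 3, 2, 2], 10)))  # 3
```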