WorldWideScience

Sample records for public domain computer

  1. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Science.gov (United States)

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... GENERAL PROVISIONS § 201.26 Recordation of documents pertaining to computer shareware and donation of public domain computer software. (a) General. This section prescribes the procedures for submission of...

  2. Developing a personal computer-based data visualization system using public domain software

    Science.gov (United States)

    Chen, Philip C.

    1999-03-01

The current research will investigate the possibility of developing a computing-visualization system using a public domain software system built on a personal computer. The Visualization Toolkit (VTK) is available on UNIX and PC platforms. VTK uses C++ to build an executable and has abundant programming classes/objects contained in its system library. Users can also develop their own classes/objects in addition to those existing in the class library, and can develop applications in any of the C++, Tcl/Tk, and JAVA environments. The present research will show how a data visualization system can be developed with VTK running on a personal computer. The topics will include: execution efficiency; visual object quality; availability of user interface design; and exploring the feasibility of a VTK-based World Wide Web data visualization system. The present research will feature a case study showing how to use VTK to visualize meteorological data with techniques including iso-surface, volume rendering, vector display, and composite analysis. The study also shows how the VTK outline, axes, and two-dimensional annotation text and title enhance the data presentation. The present research will also demonstrate how VTK works in an internet environment while accessing an executable with a JAVA application program in a webpage.
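The iso-surface technique named in the case study has a simple two-dimensional analogue, marching squares, that can be sketched without VTK itself. Below is a minimal stdlib-Python sketch; the grid resolution, test field, and function names are illustrative assumptions, not part of the system the abstract describes:

```python
import math

def marching_squares(f, level, xmin, xmax, ymin, ymax, n):
    """Extract line segments approximating the contour f(x, y) == level
    on an n-by-n cell grid (the 2-D analogue of iso-surface extraction)."""
    dx = (xmax - xmin) / n
    dy = (ymax - ymin) / n
    segs = []
    for i in range(n):
        for j in range(n):
            x0, y0 = xmin + i * dx, ymin + j * dy
            # Corner coordinates, counter-clockwise from the lower-left.
            corners = [(x0, y0), (x0 + dx, y0),
                       (x0 + dx, y0 + dy), (x0, y0 + dy)]
            vals = [f(x, y) - level for (x, y) in corners]
            pts = []
            for k in range(4):
                a, b = vals[k], vals[(k + 1) % 4]
                if (a < 0) != (b < 0):      # contour crosses this edge
                    t = a / (a - b)         # linear interpolation on the edge
                    xa, ya = corners[k]
                    xb, yb = corners[(k + 1) % 4]
                    pts.append((xa + t * (xb - xa), ya + t * (yb - ya)))
            if len(pts) == 2:               # skip ambiguous saddle cells
                segs.append((pts[0], pts[1]))
    return segs

# The level-1 contour of f = x^2 + y^2 should approximate the unit circle,
# so the total segment length should be close to the circumference 2*pi.
segs = marching_squares(lambda x, y: x * x + y * y, 1.0, -2, 2, -2, 2, 64)
length = sum(math.dist(p, q) for p, q in segs)
```

In VTK proper the equivalent step is a contour filter in the visualization pipeline; the point of the sketch is only the corner-thresholding and edge-interpolation idea behind it.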

  3. PUBLIC DOMAIN PROTECTION. USES AND REUSES OF PUBLIC DOMAIN WORKS

    OpenAIRE

    Monica Adriana LUPAȘCU

    2015-01-01

    This study tries to highlight the necessity of an awareness of the right of access to the public domain, particularly using the example of works whose protection period has expired, as well as the ones which the law considers to be excluded from protection. Such works are used not only by large libraries from around the world, but also by rights holders, via different means of use, including incorporations into original works or adaptations. However, the reuse that follows these uses often on...

  4. Public Computation & Boundary Play

    CERN Document Server

    Sengupta, Pratim

    2016-01-01

In this paper, we introduce 'public computation' as a genre of learning environments that can be used to radically broaden public participation in authentic, computation-enabled STEM disciplinary practices. Our paradigmatic approach utilizes open source software designed for professional scientists, engineers and digital artists, and situates it in an undiluted form, alongside live and archived expert support, in a public space. We present a case study of DigiPlay, a prototypical public computation space we designed at the University of Calgary, where users can interact directly with scientific simulations as well as the underlying open source code using an array of massive multi-touch screens. We argue that in such a space, public interactions with the code can be thought of as boundary work and play, through which public participation becomes a legitimate scientific act, as the public engages in scientific creation through truly open-ended explorations with the code.

  5. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial...... application development. The language is implemented in a prototype compiler that generates Java code exploiting a distributed cryptographic runtime....
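The confidentiality idea behind SMC can be illustrated with additive secret sharing, one of the standard building blocks such cryptographic runtimes use. This is a hedged stdlib-Python sketch, not the paper's language or its generated Java code; the modulus and function names are assumptions:

```python
import random

MODULUS = 2 ** 31 - 1  # illustrative prime modulus

def share(secret, n):
    """Split an integer into n additive shares; any n-1 shares reveal nothing."""
    shares = [random.randrange(MODULUS) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares):
    """Only the combination of ALL shares recovers the secret."""
    return sum(shares) % MODULUS

# Two private inputs, each split among three parties:
a = share(42, 3)
b = share(58, 3)
# Each party adds its own shares locally; no party ever sees 42 or 58.
joint = [(x + y) % MODULUS for x, y in zip(a, b)]
```

Reconstructing `joint` yields 100, the sum of the two secrets, which is the core trick a compiler for SMC can build on: arithmetic on shares stands in for arithmetic on the confidential values themselves.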

  6. Publication-quality computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Slabbekorn, M.H.; Johnston, R.B. Jr.

    1981-01-01

A user-friendly graphic software package is being used at Oak Ridge National Laboratory to produce publication-quality computer graphics. Close interaction between the graphic designer and computer programmer has helped to create a highly flexible computer graphics system. The programmer-oriented environment of computer graphics has been modified to allow the graphic designer freedom to exercise his expertise with lines, form, typography, and color. The resultant product rivals or surpasses work previously done by hand. This presentation of computer-generated graphs, charts, diagrams, and line drawings clearly demonstrates the latitude and versatility of the software when directed by a graphic designer.

  7. Domain decomposition algorithms and computational fluid dynamics

    Science.gov (United States)

    Chan, Tony F.

    1988-01-01

    Some of the new domain decomposition algorithms are applied to two model problems in computational fluid dynamics: the two-dimensional convection-diffusion problem and the incompressible driven cavity flow problem. First, a brief introduction to the various approaches of domain decomposition is given, and a survey of domain decomposition preconditioners for the operator on the interface separating the subdomains is then presented. For the convection-diffusion problem, the effect of the convection term and its discretization on the performance of some of the preconditioners is discussed. For the driven cavity problem, the effectiveness of a class of boundary probe preconditioners is examined.
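The overlapping approach this survey covers is easiest to see on a one-dimensional model problem. Below is a hedged stdlib-Python sketch of the classical alternating Schwarz method for -u'' = 1 on (0, 1) with u(0) = u(1) = 0 — a stand-in for the paper's convection-diffusion and cavity problems; the grid size, overlap, and names are illustrative:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def poisson_schwarz(n=100, overlap=10, sweeps=50):
    """Alternating Schwarz for -u'' = 1 on (0,1), u(0) = u(1) = 0,
    split into two overlapping subdomains that exchange interface values."""
    h = 1.0 / n
    u = [0.0] * (n + 1)                 # grid values u[0..n]
    mid = n // 2
    left = (1, mid + overlap)           # interior unknowns of each subdomain
    right = (mid - overlap, n - 1)
    for _ in range(sweeps):
        for lo, hi in (left, right):
            m = hi - lo + 1
            a = [1.0] * m; b = [-2.0] * m; c = [1.0] * m
            d = [-h * h] * m            # u[i-1] - 2u[i] + u[i+1] = -h^2
            d[0] -= u[lo - 1]           # Dirichlet data from current iterate
            d[-1] -= u[hi + 1]
            u[lo:hi + 1] = thomas(a, b, c, d)
    return u

u = poisson_schwarz()
# The exact solution is u(x) = x(1-x)/2, so u(0.5) = 0.125.
```

The interface values each subdomain passes to the other play exactly the role of the interface operator whose preconditioners the survey discusses; with generous overlap the iteration converges geometrically.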

  8. Cultural Heritage and the Public Domain

    Directory of Open Access Journals (Sweden)

    Bas Savenije

    2012-09-01

by providing their resources on the Internet” (Berlin Declaration 2003). Therefore, in the spirit of the Berlin Declaration, the ARL encourages its members’ libraries to grant all non-commercial users “a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship”. And: “If fees are to be assessed for the use of digitised public domain works, those fees should only apply to commercial uses” (ARL Principles, July 2010). In our view, cultural heritage institutions should make public domain material digitised with public funding as widely available as possible for access and reuse. The public sector has the primary responsibility to fund digitisation. The involvement of private partners, however, is encouraged by the ARL as well as the Comité des Sages. Private funding for digitisation is a complement to the necessary public investment, especially in times of economic crisis, but should not be seen as a substitute for public funding. As we can see from these reports, there are a number of arguments in favour of digitisation and also of providing maximum accessibility to the digitised cultural heritage. In this paper we will investigate the legal aspects of digitisation of cultural heritage, especially public domain material. On the basis of these we will make an inventory of policy considerations regarding reuse. Furthermore, we will describe the conclusions the National Library of the Netherlands (hereafter: KB) has formulated and the arguments that support these. In this context we will review public-private partnerships and also the policy of the KB. We will conclude with recommendations for cultural heritage institutions concerning a reuse policy for digitised public domain material.

  9. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  10. Preserving the positive functions of the public domain in science

    Directory of Open Access Journals (Sweden)

    Pamela Samuelson

    2003-11-01

Science has advanced in part because data and scientific methodologies have traditionally not been subject to intellectual property protection. In recent years, intellectual property has played a greater role in scientific work. While intellectual property rights may have a positive role to play in some fields of science, so does the public domain. This paper will discuss some of the positive functions of the public domain and ways in which certain legal developments may negatively impact the public domain. It suggests some steps that scientists can take to preserve the positive functions of the public domain for science.

  11. The Definition, Dimensions, and Domain of Public Relations.

    Science.gov (United States)

    Hutton, James G.

    1999-01-01

    Discusses how the field of public relations has left itself vulnerable to other fields that are making inroads into public relations' traditional domain, and to critics who are filling in their own definitions of public relations. Proposes a definition and a three-dimensional framework to compare competing philosophies of public relations and to…

  12. Domain decomposition algorithms and computational fluid dynamics

    Science.gov (United States)

    Chan, Tony F.

    1988-01-01

In the past several years, domain decomposition has been a very popular topic, partly motivated by the potential of parallelization. While a large body of theory and algorithms has been developed for model elliptic problems, these methods are only recently starting to be tested on realistic applications. The application of some of these methods to two model problems in computational fluid dynamics is investigated: the two-dimensional convection-diffusion problem and the incompressible driven cavity flow problem. The construction and analysis of efficient preconditioners for the interface operator, to be used in the iterative solution of the interface equations, is described. For the convection-diffusion problems, the effect of the convection term and its discretization on the performance of some of the preconditioners is discussed. For the driven cavity problem, the effectiveness of a class of boundary probe preconditioners is discussed.

  13. Computations of Bergman Kernels on Hua Domains

    Institute of Scientific and Technical Information of China (English)

    殷慰萍; 王安; 赵振刚; 赵晓霞; 管冰辛

    2001-01-01

The Bergman kernel function plays an important role in several complex variables. A Bergman kernel function exists on any bounded domain in Cn, but explicit formulas are known for only a few types of domains, for example bounded homogeneous domains and, in some cases, egg domains.
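One of the few such explicit formulas worth recalling here is the classical Bergman kernel of the unit disk, together with its unit-ball generalization. These two kernels are standard facts, not results of the paper; the Hua domains it treats generalize such ball-type formulas:

```latex
% Bergman kernel of the unit disk in C:
K_{\mathbb{D}}(z, w) = \frac{1}{\pi\,(1 - z\bar{w})^{2}}
% Bergman kernel of the unit ball B^n in C^n:
K_{B^n}(z, w) = \frac{n!}{\pi^{n}}\,\frac{1}{\bigl(1 - \langle z, w\rangle\bigr)^{n+1}}
```

For n = 1 the second formula reduces to the first, which is the consistency check one expects of any explicit kernel computation of this kind.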

  14. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G.

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  15. Computational thinking as an emerging competence domain

    NARCIS (Netherlands)

    Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.

    2016-01-01

Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been...

  16. Time-Domain Computation Of Electromagnetic Fields In MMICs

    Science.gov (United States)

    Lansing, Faiza S.; Rascoe, Daniel L.

    1995-01-01

Maxwell's equations are solved on three-dimensional, conformal orthogonal grids by finite-difference techniques. The method of computing frequency-dependent electrical parameters of a monolithic microwave integrated circuit (MMIC) involves time-domain computation of the propagation of the electromagnetic field in response to excitation by a single pulse at the input terminal, followed by computation of Fourier transforms to obtain the frequency-domain response from the time-domain response. Parameters computed include electric and magnetic fields, voltages, currents, impedances, scattering parameters, and effective dielectric constants. The method is a powerful and efficient means of analyzing the performance of even complicated MMICs.
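The pulse-excitation-then-Fourier-transform procedure described above can be sketched in one dimension. The following is a minimal normalized 1-D FDTD sketch in stdlib Python; the grid size, pulse parameters, and names are illustrative assumptions, not the MMIC solver itself:

```python
import cmath
import math

def fdtd_1d(steps=250, n=400, src=200, probe=300):
    """Normalized 1-D FDTD at the 'magic' time step (Courant number 1):
    inject a Gaussian pulse at src, record the time-domain field at probe."""
    ez = [0.0] * n              # electric field samples
    hy = [0.0] * n              # magnetic field samples (staggered half-cell)
    record = []
    for t in range(steps):
        for i in range(n - 1):
            hy[i] += ez[i] - ez[i + 1]       # update H from the curl of E
        for i in range(1, n):
            ez[i] += hy[i - 1] - hy[i]       # update E from the curl of H
        ez[src] += math.exp(-((t - 60) / 15.0) ** 2)  # soft Gaussian source
        record.append(ez[probe])
    return record

def dft(samples):
    """Direct DFT: turns the time-domain record into a frequency response."""
    n = len(samples)
    return [sum(s * cmath.exp(-2j * math.pi * k * m / n)
                for m, s in enumerate(samples)) for k in range(n // 2)]

record = fdtd_1d()                        # time-domain response at the probe
spectrum = [abs(v) for v in dft(record)]  # frequency-domain magnitude
```

A single broadband pulse yields the response at all frequencies of interest in one run; in the MMIC setting the same transform applied to recorded voltages and currents is what yields impedances and scattering parameters.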

  17. Domain Decomposition Based High Performance Parallel Computing

    CERN Document Server

    Raju, Mandhapati P

    2009-01-01

The study deals with the parallelization of finite element based Navier-Stokes codes using domain decomposition and state-of-the-art sparse direct solvers. There has been significant improvement in the performance of sparse direct solvers; however, parallel sparse direct solvers are not found to exhibit good scalability. Hence, the parallelization of sparse direct solvers is done using domain decomposition techniques. A highly efficient sparse direct solver, PARDISO, is used in this study. The scalability of both Newton and modified Newton algorithms is tested.


  20. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  1. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  2. Overview of Center for Domain-Specific Computing

    Institute of Scientific and Technical Information of China (English)

    Jason Cong

    2011-01-01

In this short article, we would like to introduce the Center for Domain-Specific Computing (CDSC) established in 2009, primarily funded by the US National Science Foundation with an award from the 2009 Expeditions in Computing Program. In this project we look beyond parallelization and focus on customization as the next disruptive technology to bring orders-of-magnitude power-performance efficiency improvement for applications in a specific domain.

  3. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences in root properties measured using ImageJ and Safira are suspected. This study compared values of root length and surface area obtained with public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by the confidence interval and t-test. Both systems ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those of the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are...
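The reference method mentioned above, Tennant's modified line-intersect method, estimates root length from the number of times roots cross a grid: L ≈ (π/4)·N·G for N crossings on a grid of unit G. A hedged stdlib-Python sketch follows; the digitized-root representation and names are illustrative assumptions:

```python
import math

def grid_crossings(p, q, g):
    """Count crossings of segment p -> q with vertical and horizontal grid
    lines of spacing g (endpoints assumed to lie off the grid lines)."""
    def lines_between(a, b):
        lo, hi = sorted((a, b))
        return math.floor(hi / g) - math.floor(lo / g)
    return lines_between(p[0], q[0]) + lines_between(p[1], q[1])

def tennant_length(polyline, g):
    """Line-intersect estimate of total length: L ~ (pi/4) * N * G."""
    n = sum(grid_crossings(p, q, g)
            for p, q in zip(polyline, polyline[1:]))
    return math.pi / 4 * n * g

# A straight 'root' of true length 100 at a shallow angle, grid unit 1:
root = [(0.137, 0.521),
        (0.137 + 100 * math.cos(0.3), 0.521 + 100 * math.sin(0.3))]
est = tennant_length(root, 1.0)
```

For the segment above the estimate comes out within a few percent of the true length of 100; the π/4 factor is exact only on average over root orientations, which is why the method is applied to whole washed-root samples rather than single segments.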

  4. Development and Initial Validation of Public Domain Basic Interest Markers

    Science.gov (United States)

    Liao, Hsin-Ya; Armstrong, Patrick Ian; Rounds, James

    2008-01-01

    Goldberg (Goldberg, L. R. (1999). "A broad-bandwidth, public-domain, personality inventory measuring the lower-level facets of several five-factor models." In: I. Mervielde, I. Deary, F. De Fruyt, & F. Ostendorf (Eds.), "Personality psychology in Europe" (Vol. 7, pp. 7-28). Tilburg, The Netherlands: Tilburg University Press) has argued that the…

  5. Assessment of current cybersecurity practices in the public domain : cyber indications and warnings domain.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

This report assesses current public domain cyber security practices with respect to cyber indications and warnings. It describes cybersecurity industry and government activities, including cybersecurity tools, methods, practices, and international and government-wide initiatives known to be impacting current practice. Of particular note are the U.S. Government's Trusted Internet Connection (TIC) and 'Einstein' programs, which are serving to consolidate the Government's internet access points and to provide some capability to monitor and mitigate cyber attacks. Next, this report catalogs activities undertaken by various industry and government entities. In addition, it assesses the benchmarks of HPC capability and other HPC attributes that may lend themselves to assist in the solution of this problem. This report draws few conclusions, as it is intended to assess current practice in preparation for future work; however, no explicit references to HPC usage for the purpose of analyzing cyber infrastructure in near-real-time were found in current practice. This report and the related SAND2010-4766 National Cyber Defense High Performance Computing and Analysis: Concepts, Planning and Roadmap report are intended to provoke discussion among a broad audience about developing a cohesive HPC centric solution to wide-area cybersecurity problems.

  6. Bringing computational science to the public.

    Science.gov (United States)

    McDonagh, James L; Barker, Daniel; Alderson, Rosanna G

    2016-01-01

    The increasing use of computers in science allows for the scientific analyses of large datasets at an increasing pace. We provided examples and interactive demonstrations at Dundee Science Centre as part of the 2015 Women in Science festival, to present aspects of computational science to the general public. We used low-cost Raspberry Pi computers to provide hands on experience in computer programming and demonstrated the application of computers to biology. Computer games were used as a means to introduce computers to younger visitors. The success of the event was evaluated by voluntary feedback forms completed by visitors, in conjunction with our own self-evaluation. This work builds on the original work of the 4273π bioinformatics education program of Barker et al. (2013, BMC Bioinform. 14:243). 4273π provides open source education materials in bioinformatics. This work looks at the potential to adapt similar materials for public engagement events. It appears, at least in our small sample of visitors (n = 13), that basic computational science can be conveyed to people of all ages by means of interactive demonstrations. Children as young as five were able to successfully edit simple computer programs with supervision. This was, in many cases, their first experience of computer programming. The feedback is predominantly positive, showing strong support for improving computational science education, but also included suggestions for improvement. Our conclusions are necessarily preliminary. However, feedback forms suggest methods were generally well received among the participants; "Easy to follow. Clear explanation" and "Very easy. Demonstrators were very informative." Our event, held at a local Science Centre in Dundee, demonstrates that computer games and programming activities suitable for young children can be performed alongside a more specialised and applied introduction to computational science for older visitors.

  7. Copyright and the Value of the Public Domain

    OpenAIRE

    Erickson, Kristofer; Heald, Paul; Homberg, Fabian; Kretschmer, Martin; Mendis, Dinusha

    2015-01-01

    This research report documents the results of a year-long knowledge exchange initiative undertaken between the Intellectual Property Office, researchers at the University of Glasgow CREATe Centre, and more than two dozen UK businesses and innovators, to explore how value is generated from the public domain. The study was supported by the Economic and Social Research Council (ESRC) and the Intellectual Property Office (IPO). The core research team consisted of Dr. Kristofer Erickson (Lord Kelv...

  8. A Separated Domain-Based Kernel Model for Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    FANG Yanxiang; SHEN Changxiang; XU Jingdong; WU Gongyi

    2006-01-01

This paper first investigates trusted computing on mainstream operating systems (OS). Based on these observations, it points out that trusted computing cannot be achieved due to the lack of a separation mechanism for the components in mainstream OS. In order to provide such a separation mechanism, this paper proposes a separated domain-based kernel model (SDBKM), and this model is verified by non-interference theory. By monitoring and simplifying the trust dependence between domains, this model can address trust-measurement problems such as denial-of-service (DoS) attacks and host security, and reduce the overhead of measurement.

  9. Advances in Domain Mapping of Massively Parallel Scientific Computations

    Energy Technology Data Exchange (ETDEWEB)

    Leland, Robert W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hendrickson, Bruce A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the data structures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
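The partitioning-and-assignment idea can be illustrated with the simplest possible domain mapping, a one-dimensional block distribution. This is a sketch only; the report itself concerns far more sophisticated graph-based mappings:

```python
def block_partition(n, p):
    """Assign n data items to p processors as contiguous blocks whose
    sizes differ by at most one (a minimal domain-mapping strategy)."""
    base, extra = divmod(n, p)
    bounds = []
    start = 0
    for rank in range(p):
        size = base + (1 if rank < extra else 0)  # spread the remainder
        bounds.append((start, start + size))
        start += size
    return bounds

parts = block_partition(10, 3)  # -> [(0, 4), (4, 7), (7, 10)]
```

Block distributions balance the item counts but ignore communication between pieces; minimizing that communication while keeping the balance is precisely what turns domain mapping into the hard combinatorial problem the report addresses.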

  10. Approximation method to compute domain related integrals in structural studies

    Science.gov (United States)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2015-11-01

Various engineering calculi use integral calculus in theoretical models, i.e. analytical and numerical models. For usual problems, integrals have exact mathematical solutions. If the domain of integration is complicated, several methods may be used to calculate the integral. The first idea is to divide the domain into smaller sub-domains for which there are direct calculus relations, i.e. in strength of materials the bending moment may be computed at some discrete points using graphical integration of the shear force diagram, which usually has a simple shape. Another example is in mathematics, where the area of a subgraph may be approximated by a set of rectangles or trapezoids used to calculate the definite integral. The goal of the work is to introduce our studies about the calculus of integrals over transverse section domains, computer-aided solutions and a generalizing method. The aim of our research is to create general computer-based methods to execute the calculi in structural studies. Thus, we define a Boolean algebra which operates with ‘simple’-shape domains. This algebraic standpoint uses addition and subtraction, conditioned by the sign of every ‘simple’ shape (-1 for the shapes to be subtracted). By a ‘simple’ or ‘basic’ shape we mean either a shape for which there are direct calculus relations, or a domain whose frontier is approximated by known functions and for which the calculus is carried out using an algorithm. The ‘basic’ shapes are linked to the calculus of the most significant stresses in the section, a refined aspect which needs special attention. Starting from this idea, rectangles, ellipses and domains whose frontiers are approximated by spline functions were included in the libraries of ‘basic’ shapes. The domain triangularization methods suggested that another ‘basic’ shape to be considered is the triangle. The subsequent phase was to deduce the exact relations for the...
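The signed 'basic shape' algebra described above can be sketched directly: each shape carries a sign (-1 for shapes to be subtracted), and a section property is a signed sum over the shapes. A minimal stdlib-Python sketch under that convention; the class and field names are illustrative, not the authors' implementation:

```python
import math
from dataclasses import dataclass

@dataclass
class Rect:
    width: float
    height: float
    sign: int = +1                     # -1 marks a shape to be subtracted
    def area(self):
        return self.sign * self.width * self.height

@dataclass
class Ellipse:
    a: float                           # semi-axes
    b: float
    sign: int = +1
    def area(self):
        return self.sign * math.pi * self.a * self.b

def section_area(shapes):
    """Composite section property as a signed sum over 'basic' shapes."""
    return sum(s.area() for s in shapes)

# A 4 x 2 rectangular section with a circular hole of radius 0.5:
section = [Rect(4.0, 2.0), Ellipse(0.5, 0.5, sign=-1)]
area = section_area(section)           # 8 - pi/4
```

The same signed-sum pattern extends to first and second moments of area, which is what makes the algebra useful for the stress calculations the abstract mentions.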

  11. Computation of Steady Incompressible Flows in Unbounded Domains

    CERN Document Server

    Gustafsson, Jonathan

    2014-01-01

    In this study we revisit the problem of computing steady Navier-Stokes flows in two-dimensional unbounded domains. Precise quantitative characterization of such flows in the high-Reynolds number limit remains an open problem of theoretical fluid dynamics. Following a review of key mathematical properties of such solutions related to the slow decay of the velocity field at large distances from the obstacle, we develop and carefully validate a spectrally-accurate computational approach which ensures the correct behavior of the solution at infinity. In the proposed method the numerical solution is defined on the entire unbounded domain without the need to truncate this domain to a finite box with some artificial boundary conditions prescribed at its boundaries. Since our approach relies on the streamfunction-vorticity formulation, the main complication is the presence of a discontinuity in the streamfunction field at infinity which is related to the slow decay of this field. We demonstrate how this difficulty ca...

  12. Public access computing in health science libraries.

    Science.gov (United States)

    Kehm, S

    1987-01-01

    Public access computing in health science libraries began with online computer-assisted instruction. Library-based collections and services have expanded with advances in microcomputing hardware and software. This growth presents problems: copyright, quality, instability in the publishing industry, and uncertainty about collection scope; librarians managing the new services require new skills to support their collections. Many find the cooperative efforts of several organizational units are required. Current trends in technology for the purpose of information management indicate that these services will continue to be a significant focus for libraries.

  13. Computational simulation of wave propagation problems in infinite domains

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

This paper deals with the computational simulation of both scalar wave and vector wave propagation problems in infinite domains. Due to its advantages in simulating complicated geometry and complex material properties, the finite element method is used to simulate the near field of a wave propagation problem involving an infinite domain. To avoid wave reflection and refraction at the common boundary between the near field and the far field of an infinite domain, special treatment must be applied at this boundary. For a wave radiation problem, a wave absorbing boundary can be applied to the common boundary between the near field and the far field of an infinite domain, while for a wave scattering problem, the dynamic infinite element can be used to propagate the incident wave from the near field to the far field of the infinite domain. For the sake of illustrating how these two different approaches are used to simulate the effect of the far field, a mathematical expression for a wave absorbing boundary of high-order accuracy is derived from a two-dimensional scalar wave radiation problem in an infinite domain, while the detailed mathematical formulation of the dynamic infinite element is derived from a two-dimensional vector wave scattering problem in an infinite domain. Finally, the coupled method of finite elements and dynamic infinite elements is used to investigate the effects of topographical conditions on the free field motion along the surface of a canyon.

  14. Protecting Terminals by Security Domain Mechanism Based on Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    ZHOU Zheng; ZHANG Jun; LI Jian; LIU Yi

    2006-01-01

Networks are composed of servers and rather large numbers of terminals, and most attacks and viruses originate from terminals. Eliminating malicious code and access, or breaking the conditions under which attacks and viruses can be invoked on those terminals, would be the most effective way to protect information systems. The concept of trusted computing is first introduced into terminal virus immunity. A model of a security-domain mechanism based on trusted computing to protect computers is then proposed by abstracting general information systems. The attack-resistance and risk-limitation principles of the model are demonstrated by means of mathematical analysis, and a realization of the model is proposed.

  15. Peer Review-Based Scripted Collaboration to Support Domain-Specific and Domain-General Knowledge Acquisition in Computer Science

    Science.gov (United States)

    Demetriadis, Stavros; Egerter, Tina; Hanisch, Frank; Fischer, Frank

    2011-01-01

    This study investigates the effectiveness of using peer review in the context of scripted collaboration to foster both domain-specific and domain-general knowledge acquisition in the computer science domain. Using a one-factor design with a script and a control condition, students worked in small groups on a series of computer science problems…

  16. Agents unleashed a public domain look at agent technology

    CERN Document Server

    Wayner, Peter

    1995-01-01

    Agents Unleashed: A Public Domain Look at Agent Technology covers details of building a secure agent realm. The book discusses the technology for creating seamlessly integrated networks that allow programs to move from machine to machine without leaving a trail of havoc; as well as the technical details of how an agent will move through the network, prove its identity, and execute its code without endangering the host. The text also describes the organization of the host's work processing an agent; error messages, bad agent expulsion, and errors in XLISP-agents; and the simulators of errors, f

  17. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  18. The Human-Computer Domain Relation in UX Models

    DEFF Research Database (Denmark)

    Clemmensen, Torkil

    This paper argues that the conceptualization of the human, the computer and the domain of use in competing lines of UX research have problematic similarities and superficial differences. The paper qualitatively analyses concepts and models in five research papers that together represent two influential lines of UX research: aesthetics and temporal UX, and two use situations: using a website and starting to use a smartphone. The results suggest that the two lines of UX research share a focus on users’ evaluative judgments of technology, both focus on product qualities rather than activity domains, give little detail about users, and treat human-computer interaction as perception. The conclusion gives similarities and differences between the approaches to UX. The implications for theory building are indicated.

  19. Public Websites and Human–computer Interaction

    DEFF Research Database (Denmark)

    Sørum, Hanne; Andersen, Kim Normann; Vatrapu, Ravi

    2012-01-01

    The focus of this paper is to investigate measurement of website quality and user satisfaction. More specifically, the paper reports on a study investigating whether users of high-quality public websites are more satisfied than those of low-quality websites. Adopting a human–computer interaction perspective, we have gathered data from the 2009 public website awards in Scandinavia. Our analysis of Norwegian and Danish websites reveals that the use of quality criteria is highly technical compared to the traditional usability-testing focus on efficiency, effectiveness and satisfaction of the actual system use by representatives. A Pearson correlation analysis of user evaluations from 296 websites that participated in the Danish web award Bedst på Nettet (‘Top of the Web’) showed no significant positive correlation between website quality and user satisfaction. We put forward recommendations…
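The correlation statistic named in this record can be computed directly. A minimal sketch follows; the website scores below are invented, not the Bedst på Nettet data.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# invented quality and satisfaction scores for five hypothetical websites
quality = [62.0, 75.0, 81.0, 58.0, 90.0]
satisfaction = [3.9, 3.1, 4.0, 3.8, 3.2]
r = pearson_r(quality, satisfaction)
```

A value of r near zero, as the study reports for its 296 websites, means high technical quality scores do not predict higher user satisfaction.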

  20. Replacement of annular domain with trapezoidal domain in computational modeling of nonaqueous-phase-liquid dissolution-front propagation problems

    Institute of Scientific and Technical Information of China (English)

    ZHAO Chong-bin; Thomas POULET; Klaus REGENAUER-LIEB

    2015-01-01

    In order to simulate the instability phenomenon of a nonaqueous phase liquid (NAPL) dissolution front in a computational model, the intrinsic characteristic length is commonly used to determine the length scale at which the instability of the NAPL dissolution front can be initiated. This will require a huge number of finite elements if a whole NAPL dissolution system is simulated in the computational model. Even though modern supercomputers might be used to tackle this kind of NAPL dissolution problem, it can become prohibitive for commonly-used personal computers to do so. The main purpose of this work is to investigate whether or not the whole NAPL dissolution system of an annular domain can be replaced by a trapezoidal domain, so as to greatly reduce the requirements for computer efforts. The related simulation results have demonstrated that when the NAPL dissolution system under consideration is in a subcritical state, if the dissolution pattern around the entrance of an annulus domain is of interest, then a trapezoidal domain cannot be used to replace an annular domain in the computational simulation of the NAPL dissolution system. However, if the dissolution pattern away from the vicinity of the entrance of an annulus domain is of interest, then a trapezoidal domain can be used to replace an annular domain in the computational simulation of the NAPL dissolution system. When the NAPL dissolution system under consideration is in a supercritical state, a trapezoidal domain cannot be used to replace an annular domain in the computational simulation of the NAPL dissolution system.

  1. THE DOMAIN DECOMPOSITION TECHNIQUES FOR THE FINITE ELEMENT PROBABILITY COMPUTATIONAL METHODS

    Institute of Scientific and Technical Information of China (English)

    LIU Xiaoqi

    2000-01-01

    In this paper, we shall study the domain decomposition techniques for the finite element probability computational methods. These techniques provide a theoretical basis for parallel probability computational methods.

  2. Linearized Aeroelastic Computations in the Frequency Domain Based on Computational Fluid Dynamics

    CERN Document Server

    Amsallem, David; Choi, Youngsoo; Farhat, Charbel

    2015-01-01

    An iterative, CFD-based approach for aeroelastic computations in the frequency domain is presented. The method relies on a linearized formulation of the aeroelastic problem and a fixed-point iteration approach and enables the computation of the eigenproperties of each of the wet aeroelastic eigenmodes. Numerical experiments on the aeroelastic analysis and design optimization of two wing configurations illustrate the capability of the method for the fast and accurate aeroelastic analysis of aircraft configurations and its advantage over classical time-domain approaches.
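The fixed-point iteration underlying the method in this record can be illustrated in miniature. The sketch below iterates x_{k+1} = g(x_k) on a scalar toy problem rather than the coupled aeroelastic eigenproblem; the function names are invented.

```python
import math

def fixed_point(g, x0, tol=1e-12, max_iter=200):
    """Plain fixed-point iteration x_{k+1} = g(x_k), stopping when
    successive iterates agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

# toy contraction: x = cos(x) converges to the Dottie number ~0.739085
root = fixed_point(math.cos, 1.0)
```

Convergence requires the map to be contractive near the solution, which is the same condition that governs the aeroelastic fixed-point scheme's iteration count.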

  3. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
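The core FDTD update that the book develops can be shown in its simplest form. The sketch below is a one-dimensional, normalized-units Yee scheme with a hard source, written for illustration and not taken from the book.

```python
import math

def fdtd_1d(nz=200, nt=150):
    """Minimal 1D Yee FDTD loop in normalized units at the 'magic'
    Courant number S = 1: interleaved E and H updates plus a hard
    Gaussian source at the left edge of the grid."""
    Ex = [0.0] * nz
    Hy = [0.0] * nz
    for n in range(nt):
        for k in range(nz - 1):          # H update from the curl of E
            Hy[k] += Ex[k + 1] - Ex[k]
        for k in range(1, nz):           # E update from the curl of H
            Ex[k] += Hy[k] - Hy[k - 1]
        Ex[0] = math.exp(-((n - 30) / 10.0) ** 2)  # hard source
    return Ex, Hy
```

At S = 1 the 1D scheme is dispersionless, so the pulse travels exactly one cell per step: after 150 steps the source peak launched at n = 30 sits near cell 119.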

  4. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

    I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  5. Molecular scaffold analysis of natural products databases in the public domain.

    Science.gov (United States)

    Yongye, Austin B; Waddell, Jacob; Medina-Franco, José L

    2012-11-01

    Natural products represent important sources of bioactive compounds in drug discovery efforts. In this work, we compiled five natural products databases available in the public domain and performed a comprehensive chemoinformatic analysis focused on the content and diversity of the scaffolds with an overview of the diversity based on molecular fingerprints. The natural products databases were compared with each other and with a set of molecules obtained from in-house combinatorial libraries, and with a general screening commercial library. It was found that publicly available natural products databases have different scaffold diversity. In contrast to the common concept that larger libraries have the largest scaffold diversity, the largest natural products collection analyzed in this work was not the most diverse. The general screening library showed, overall, the highest scaffold diversity. However, considering the most frequent scaffolds, the general reference library was the least diverse. In general, natural products databases in the public domain showed low molecule overlap. In addition to benzene and acyclic compounds, flavones, coumarins, and flavanones were identified as the most frequent molecular scaffolds across the different natural products collections. The results of this work have direct implications in the computational and experimental screening of natural product databases for drug discovery.
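A minimal sketch of the scaffold-diversity bookkeeping described in this record; the scaffold labels are invented placeholders for Murcko frameworks that would be computed with a chemoinformatics toolkit.

```python
from collections import Counter

def scaffold_stats(scaffolds):
    """Scaffold frequency table and the unique-scaffolds-per-molecule
    ratio, a simple diversity measure for comparing compound libraries."""
    counts = Counter(scaffolds)
    return counts, len(counts) / len(scaffolds)

# invented scaffold labels for a six-molecule toy library
library = ["benzene", "flavone", "benzene", "coumarin", "flavone", "benzene"]
counts, diversity = scaffold_stats(library)
```

Here three unique scaffolds over six molecules give a diversity ratio of 0.5, with benzene the most frequent scaffold, mirroring the kind of counts reported in the record.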

  6. An outlook on the nature of mental creations after belonging to public domain

    Directory of Open Access Journals (Sweden)

    Pedjman Mohammadi

    2015-05-01

    Full Text Available The public domain of copyright, which concerns the end of the protection period, attempts to strike a balance among the rights of authors, society and third parties. At the end of the protection period for the author's financial rights, free utilization of these literary works becomes possible. But in this situation one of the controversial difficulties is the nature of such works, which, according to some scholars, change into the Allowable after falling into the public domain. To support this idea, they point to common features shared by works in the public domain and the Allowable. On the other hand, it is believed that literary works, after falling into the public domain, are not considered property at all, due to their lack of the element of scarcity.

  7. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
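A toy example of approach (iv), embedding the simulation specification in the host language. The class and attribute names below are invented for illustration and are not the actual OOMMF interface described in the paper.

```python
class Mesh:
    """Toy discretization description (hypothetical names, not OOMMF)."""
    def __init__(self, cellsize, n):
        self.cellsize, self.n = cellsize, n

class Simulation:
    """Simulation spec expressed as ordinary Python objects, so a run
    is configured by a script rather than by config files or a GUI."""
    def __init__(self, name, mesh):
        self.name, self.mesh, self.terms = name, mesh, []
    def add(self, term):
        self.terms.append(term)
        return self  # chaining keeps the specification readable
    def describe(self):
        return f"{self.name}: {len(self.terms)} energy terms on {self.mesh.n} cells"

sim = Simulation("stdprob4", Mesh(cellsize=(5e-9, 5e-9, 3e-9), n=(100, 25, 1)))
sim.add(("exchange", {"A": 1.3e-11})).add(("demag", {}))
```

Because the spec is plain Python, loops, functions and version control apply to it directly, which is the reproducibility advantage the paper identifies for embedded domain-specific languages.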

  8. Fast resolution of the neutron diffusion equation through public domain Ode codes

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, V.M.; Vidal, V.; Garayoa, J. [Universidad Politecnica de Valencia, Departamento de Sistemas Informaticos, Valencia (Spain); Verdu, G. [Universidad Politecnica de Valencia, Departamento de Ingenieria Quimica y Nuclear, Valencia (Spain); Gomez, R. [I.E.S. de Tavernes Blanques, Valencia (Spain)

    2003-07-01

    The time-dependent neutron diffusion equation is a partial differential equation with source terms. The resolution method usually includes discretizing the spatial domain, obtaining a large system of linear, stiff ordinary differential equations (ODEs), whose resolution is computationally very expensive. Some standard techniques use a fixed time step to solve the ODE system. This can result in errors (if the time step is too large) or in long computing times (if the time step is too small). To speed up the resolution method, two well-known public domain codes have been selected: DASPK and FCVODE, which are powerful codes for the resolution of large systems of stiff ODEs. These codes can estimate the error after each time step and, depending on this estimate, can decide on the new time step and, possibly, the integration method to be used in the next step. With these mechanisms, it is possible to keep the overall error below the chosen tolerances and, when the system behaves smoothly, to take large time steps, increasing the execution speed. In this paper we address the use of the public domain codes DASPK and FCVODE for the resolution of the time-dependent neutron diffusion equation. The efficiency of these codes depends largely on the preconditioning of the big systems of linear equations that must be solved. Several preconditioners have been programmed and tested; it was found that the multigrid method is the best of the preconditioners tested. Also, it has been found that DASPK has performed better than FCVODE, being more robust for our problem. We can conclude that the use of specialized codes for solving large systems of ODEs can reduce drastically the computational work needed for the solution; and combining them with appropriate preconditioners, the reduction can be still more important.
    It has other crucial advantages, since it allows the user to specify the allowed error, which cannot be done in fixed step implementations; this, of course
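The error-controlled stepping that DASPK and FCVODE automate can be illustrated in miniature. The sketch below uses step doubling with a classical RK4 step; it is an invented toy, not the codes' BDF/Krylov machinery.

```python
def rk4_step(f, t, y, h):
    """One classical Runge-Kutta 4 step for a scalar ODE y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def adaptive_solve(f, t0, y0, t_end, tol=1e-8):
    """Step-doubling error control: compare one step of size h against
    two steps of size h/2, accept when the difference is below tol,
    and grow or shrink h accordingly."""
    t, y, h = t0, y0, (t_end - t0) / 100
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        y_big = rk4_step(f, t, y, h)
        y_half = rk4_step(f, t + h / 2, rk4_step(f, t, y, h / 2), h / 2)
        err = abs(y_half - y_big)
        if err <= tol:              # accept the more accurate result
            t, y = t + h, y_half
            if err < tol / 32:      # error comfortably small: grow step
                h *= 2
        else:                       # reject and retry with smaller step
            h /= 2
    return y
```

For a smooth problem such as y' = -y the step size settles near the largest h the tolerance allows, which is why adaptive codes outrun fixed-step integrators when the system behaves smoothly.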

  9. Time-Domain Terahertz Computed Axial Tomography NDE System

    Science.gov (United States)

    Zimdars, David

    2012-01-01

    NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment, where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320-ps, 100-Hz waveform acquisition system with an 80-ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples. The motion control gantry allows inspection of sample sizes of up to approximately one cubic foot (~0.03 cubic meters). The system reduced to practice a CT TD-THz system incorporating a COTS 80-ps/1-kHz waveform scanner.
    The incorporation of this scanner in the system allows acquisition of 3D

  10. 3D Vectorial Time Domain Computational Integrated Photonics

    Energy Technology Data Exchange (ETDEWEB)

    Kallman, J S; Bond, T C; Koning, J M; Stowell, M L

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D-Time-Domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While there exist a number of photonics simulation tools on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation induced details, Optical Logic edge emitting lasers with lateral optical inputs). In addition we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications). We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem for both the

  11. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    Science.gov (United States)

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-07-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  13. The Use and Abuse of Research in the Public Domain

    Science.gov (United States)

    Reid, Alan

    2016-01-01

    In Australia, education think tanks have become increasingly influential in policy circles through "reports" to government, and in public debate through the mainstream media. Invariably think-tanks draw on educational research to lend authority and legitimacy to their work. This is desirable if the research deepens understandings about…

  14. Computer-Assisted Management of Instruction in Veterinary Public Health

    Science.gov (United States)

    Holt, Elsbeth; And Others

    1975-01-01

    Reviews a course in Food Hygiene and Public Health at the University of Illinois College of Veterinary Medicine in which students are sequenced through a series of computer-based lessons or autotutorial slide-tape lessons, the computer also being used to route, test, and keep records. Since grades indicated mastery of the subject, the course will…

  15. Wildlife software: procedures for publication of computer software

    Science.gov (United States)

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  16. Code and papers: computing publication patterns in the LHC era

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  17. Computational Analysis of the Binding Specificities of PH Domains

    Directory of Open Access Journals (Sweden)

    Zhi Jiang

    2015-01-01

    Full Text Available Pleckstrin homology (PH) domains share low sequence identities but have extremely conserved structures. They have been found in many proteins for cellular signal-dependent membrane targeting by binding inositol phosphates to perform different physiological functions. In order to understand the sequence-structure relationship and binding specificities of PH domains, quantum mechanical (QM) calculations and sequence-based combined with structure-based binding analysis were employed in our research. In the structural aspect, the binding specificities were shown to correlate with the hydropathy characteristics of PH domains and electrostatic properties of the bound inositol phosphates. By comparing these structure properties with sequence-based profiles of physicochemical properties, PH domains can be classified into four functional subgroups according to their binding specificities and affinities to inositol phosphates. The method not only provides a simple and practical paradigm to predict binding specificities for functional genomic research but also gives new insight into the understanding of the basis of diseases with respect to PH domain structures.

  18. Suburban development – a search for public domains in Danish suburban neighbourhoods

    DEFF Research Database (Denmark)

    Melgaard, Bente; Bech-Danielsen, Claus

    …potentials for bridge-building across the enclaves of the suburb are looked for through a combined architectural-anthropological mapping of public spaces in a specific suburb in Denmark, the analyses being carried out in the light of Hajer & Reijndorp’s definition of public domains and the term exchange. The results so far show that suburban spaces with a potential for creating bridge-building across the segregated enclaves do exist but that, among other things, focus on spatial design is needed before actual public domains creating the basis for exchange are achieved.

  19. Materialities of Law: Celebrity Production and the Public Domain

    Directory of Open Access Journals (Sweden)

    Esther Milne

    2009-12-01

    Full Text Available Celebrity production and consumption are powerful socio-economic forces. The celebrity functions as a significant economic resource for the commercial sector and plays a fundamental symbolic role within culture by providing a shared ‘vocabulary’ through which to understand contemporary social relations. A pivotal element of this allure is the process by which the celebrity figure is able to forge an intimate link with its audience, often producing public expressions of profound compassion, respect or revulsion. This process, however, is complicated by emerging participatory media forms whose impact is experienced as new conditions of possibility for celebrity production and consumption. As Marshall argues, video mash-ups of celebrity interviews, such as those of Christian Bale or Tom Cruise, are dramatically changing the relation between celebrity and audience (Marshall, 2006: 640). Meanings produced by these audience remixes challenge the extent to which a celebrity might control her image. So is the celebrity personality, therefore, a public or private commodity? Who owns the celebrity image within remix culture? Although the celebrity figure has been thoroughly researched in relation to its patterns of consumption, semiotic power, and industry construction, less attention has been focused on the forms of celebrity governance enabled by legislative and case law settings. How might the law deal with the significant economic and cultural power exercised within celebrity culture?

  20. The Domain Shared by Computational and Digital Ontology: A Phenomenological Exploration and Analysis

    Science.gov (United States)

    Compton, Bradley Wendell

    2009-01-01

    The purpose of this dissertation is to explore and analyze a domain of research thought to be shared by two areas of philosophy: computational and digital ontology. Computational ontology is philosophy used to develop information systems also called computational ontologies. Digital ontology is philosophy dealing with our understanding of Being…

  1. High-Performance Computational Electromagnetics in Frequency-Domain and Time-Domain

    Science.gov (United States)

    2015-03-04

    …aforementioned contributions, for a given… In view of its applications to seismic wave propagation, Dr. Amlani’s PhD thesis received two awards at Caltech, one… for wave scattering problems. PhD thesis, California Institute of Technology, 2014. Available at http://www.its.caltech.edu/~obruno/preprints… solutions for some of the most challenging scattering problems in science and engineering. Electromagnetic scattering. Frequency-domain solvers. Integral…

  2. Assessing water availability over peninsular Malaysia using public domain satellite data products

    Science.gov (United States)

    Ali, M. I.; Hashim, M.; Zin, H. S. M.

    2014-02-01

    Water availability monitoring is an essential task for water resource sustainability and security. In this paper, the assessment of a satellite remote sensing technique for determining water availability is reported. Water-balance analysis is used to compute spatio-temporal water availability from two main inputs, precipitation and the actual evapotranspiration rate (AET), fully derived from the public-domain satellite products of the Tropical Rainfall Measuring Mission (TRMM) and MODIS, respectively. Both satellite products were first calibrated against selected local precipitation and AET samples. Multi-temporal data sets acquired during 2000-2010 were used in this study. The results of the study indicated strong agreement of monthly water availability with the basin flow rate (r2 = 0.5, p < 0.001). Similar agreement was also noted between the estimated annual average water availability and the in-situ measurement. It is therefore concluded that the method devised in this study provides a new alternative for water availability mapping over large areas, offering a timely and cost-effective method that also provides comprehensive spatio-temporal patterns, crucial in water resource planning to ensure water security.
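
The water-balance bookkeeping this record describes is simple to sketch: monthly availability is precipitation minus AET. The following stand-in is purely illustrative; the series, units and function name are invented, not taken from the paper.

```python
# Illustrative water-balance sketch: monthly water availability (mm) as
# precipitation minus actual evapotranspiration (AET). Hypothetical data.

def water_availability(precip_mm, aet_mm):
    """Monthly water availability (mm) as precipitation minus AET."""
    if len(precip_mm) != len(aet_mm):
        raise ValueError("series must align month by month")
    return [p - a for p, a in zip(precip_mm, aet_mm)]

# Hypothetical 6-month series (mm/month), e.g. TRMM-like and MODIS-like.
precip = [210.0, 180.0, 150.0, 90.0, 60.0, 120.0]
aet    = [110.0, 105.0, 100.0, 80.0, 70.0, 90.0]

monthly = water_availability(precip, aet)
annual_avg = sum(monthly) / len(monthly)
```

In the paper this difference is computed per pixel and per month from calibrated satellite rasters; the scalar series above only shows the arithmetic.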

  3. Funding Public Computing Centers: Balancing Broadband Availability and Expected Demand

    Science.gov (United States)

    Jayakar, Krishna; Park, Eun-A

    2012-01-01

    The National Broadband Plan (NBP) recently announced by the Federal Communications Commission envisions a significantly enhanced commitment to public computing centers (PCCs) as an element of the Commission's plans for promoting broadband availability. In parallel, the National Telecommunications and Information Administration (NTIA) has…

  4. A domain decomposition study of massively parallel computing in compressible gas dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Wong, C.C.; Blottner, F.G.; Payne, J.L. [Sandia National Labs., Albuquerque, NM (United States); Soetrisno, M. [Amtec Engineering, Inc., Bellevue, WA (United States)

    1995-01-01

    The appropriate utilization of massively parallel computers for solving the Navier-Stokes equations is investigated from an engineering perspective. The issues investigated are: (1) Should strip or patch domain decomposition of the spatial mesh be used to reduce computer time? (2) How many computer nodes should be used for a problem with a given mesh size to reduce computer time? (3) Is the convergence of the Navier-Stokes solution procedure (LU-SGS) adversely influenced by the domain decomposition approach? The results of the paper show that the present Navier-Stokes solution technique has good performance on a massively parallel computer for transient flow problems. For steady-state problems with a large number of mesh cells, the solution procedure will require significant computer time due to an increased number of iterations to achieve a converged solution. There is an optimum number of computer nodes to use for a problem with a given global mesh size.
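
The strip-versus-patch question in item (1) can be illustrated with a back-of-the-envelope halo-traffic comparison. The mesh size, node count and function names below are invented for this sketch and do not model the paper's meshes or the LU-SGS solver.

```python
# Back-of-the-envelope comparison of strip vs. patch decomposition for an
# n x n mesh on p nodes: ghost-cell (halo) traffic per interior node.
import math

def strip_halo(n):
    """Ghost cells exchanged per interior node, 1-D (strip) decomposition."""
    return 2 * n  # one full row of ghosts on each side

def patch_halo(n, p):
    """Ghost cells exchanged per interior node, 2-D (patch) decomposition."""
    q = math.isqrt(p)   # assume p is a perfect square
    side = n // q       # edge length of one square patch
    return 4 * side     # one ghost row/column on each of four sides

# For a 1024 x 1024 mesh on 64 nodes, patches need 4x less halo traffic.
ratio = strip_halo(1024) / patch_halo(1024, 64)
```

The smaller perimeter-to-area ratio of square patches is the usual argument for 2-D decomposition, though in practice (as the abstract notes) convergence behavior and message latency also enter the trade-off.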

  5. Survey of Energy Computing in the Smart Grid Domain

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar

    2013-07-01

    Resource optimization with advanced computing tools improves the efficient use of energy resources. Renewable energy resources are intermittent and need to be conserved at the same time. Optimizing a real-time process with a complex design involves planning of resources and control for effective utilization. Advances in information and communication technology (ICT) tools enable data formatting and analysis, resulting in optimized use of renewable resources for a sustainable energy solution on the smart grid. The paper presents energy computing models for optimally allocating different types of renewables in the distribution system so as to minimize energy loss. The proposed energy computing model optimizes the integration of renewable energy resources with technical and financial feasibility. An econometric model identifies the potential of renewable energy sources, mapping them for computational analysis, which enables the study to forecast the demand and supply scenario. The enriched database on renewable sources and government policies customizes a delivery model with the potential to transcend the costs-versus-benefits barrier. Simulation and modeling techniques overcome the drawbacks of traditional ICT in tackling the new challenges of maximizing the benefits of a smart hybrid grid. Data management has to start at the initial reception of the energy source data, reviewing it for events that should trigger alarms in outage management systems and other real-time systems, such as portfolio management by a virtual hybrid power plant operator. The study focuses on two renewable sources, solar and wind, and can be extended to other renewable sources.

  6. The finite difference time domain method on a massively parallel computer

    NARCIS (Netherlands)

    Ewijk, L.J. van

    1996-01-01

    At the Physics and Electronics Laboratory TNO much research is done in the field of computational electromagnetics (CEM). One of the tools in this field is the Finite Difference Time Domain method (FDTD), a method that has been implemented in a program in order to be able to compute electromagnetic
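
The Finite Difference Time Domain method named in this record can be sketched in a few lines: leapfrog updates of E and H on a staggered grid. The following is a minimal normalized 1-D Yee-style loop, illustrative only and not TNO's implementation; grid size, source position and pulse parameters are invented.

```python
# Minimal 1-D FDTD sketch: staggered E/H grid, normalized units,
# Courant number 0.5, soft Gaussian source injected at one cell.
from math import exp

def fdtd_1d(steps=200, size=200, src=100):
    ez = [0.0] * size  # electric field
    hy = [0.0] * size  # magnetic field
    for t in range(steps):
        for k in range(size - 1):            # magnetic-field update
            hy[k] += 0.5 * (ez[k + 1] - ez[k])
        for k in range(1, size):             # electric-field update
            ez[k] += 0.5 * (hy[k] - hy[k - 1])
        ez[src] += exp(-0.5 * ((t - 30) / 10.0) ** 2)  # soft source
    return ez

field = fdtd_1d()
```

A production code adds absorbing boundaries, material coefficients and 3-D field components, but the two interleaved update sweeps above are the core of the method.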

  7. Computing the Feng-Rao distances for codes from order domains

    DEFF Research Database (Denmark)

    Ruano Benito, Diego

    2007-01-01

    We compute the Feng–Rao distance of a code coming from an order domain with a simplicial value semigroup. The main tool is the Apéry set of a semigroup, which can be computed using a Gröbner basis.
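
The Apéry set mentioned as the main tool is easy to compute by brute force for a numerical semigroup given its generators: Ap(S, m) collects, for each residue class mod m, the smallest semigroup element in that class. The sketch below is only illustrative; the paper itself works via Gröbner bases.

```python
# Brute-force Apery set of a numerical semigroup given its generators.

def semigroup_elements(gens, bound):
    """All elements of the numerical semigroup <gens> up to bound."""
    elems = {0}
    for n in range(1, bound + 1):
        if any(n - g in elems for g in gens if n >= g):
            elems.add(n)
    return elems

def apery_set(gens, m):
    """Smallest semigroup element in each residue class mod m."""
    bound = m * max(gens)  # crude but sufficient search bound
    elems = semigroup_elements(gens, bound)
    ap = {}
    for e in sorted(elems):
        r = e % m
        if r not in ap:
            ap[r] = e
    return sorted(ap.values())

# For S = <3, 5>: Ap(S, 3) = {0, 5, 10}, and the Frobenius number
# is max(Ap(S, 3)) - 3 = 7.
```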

  8. Computer Advisory System in the Domain of Copper Alloys Manufacturing

    Directory of Open Access Journals (Sweden)

    Wilk-Kołodziejczyk D.

    2015-09-01

    The main scope of the article is the development of a computer system that gives advice on problems of copper alloy manufacturing. The problem involves choosing an appropriate type of bronze (e.g. the BA 1044 bronze) with possible modification (e.g. calcium carbide modifications: Ca + C or CaC2) and possible heat treatment operations (quenching, tempering) in order to obtain the desired mechanical properties of the manufactured material, described by tensile strength (Rm), yield strength (Rp0.2) and elongation (A5). For the construction of the computer system that is the goal of the work presented here, Case-Based Reasoning is proposed. Case-Based Reasoning is a methodology within Artificial Intelligence techniques that enables solving new problems based on experiences, that is, solutions obtained in the past. Case-Based Reasoning also enables incremental learning, because every new experience is retained in order to be available for future problem-solving processes. The solution proposed by the developed system can be used by a technologist as a rough solution to a copper alloy manufacturing problem, which requires further tests to confirm its correctness.

  9. Safety features in nuclear power plants to eliminate the need of emergency planning in public domain

    Indian Academy of Sciences (India)

    P K Vijayan; M T Kamble; A K Nayak; K K Vaze; R K Sinha

    2013-10-01

    Following the Fukushima accident, the safety features of nuclear power plants (NPPs) are being re-examined worldwide, including in India, to demonstrate the capability to cope with severe accidents. In order to restore public confidence in and support for nuclear power, it is felt necessary to design future NPPs with near-zero impact outside the plant boundary, thus enabling the elimination of emergency planning in the public domain. The authors have identified a set of safety features that need to be incorporated in advanced reactors to achieve this goal. These features, enabling prevention, termination, mitigation and containment of radioactivity for beyond-design-basis accidents arising from extreme natural events, are essential for achieving the goal of eliminating emergency planning in the public domain. Inherent safety characteristics and passive and engineered safety features to achieve these functions are discussed in this paper. Present trends and future developments in this direction are also described briefly.

  10. Novel Techniques for Secure Use of Public Cloud Computing Resources

    Science.gov (United States)

    2015-09-17

    ...of a few common standard assumptions presented in this format: Discrete Logarithm (DL) [1]: Let G be a cyclic group with generator g. Let D be the... for the system. AuthoritySetup(GP) → MSK_A, AP_A: each authority runs the authority setup algorithm with the global parameters GP and produces a master... secret key MSK_A and some public authority parameters AP_A for an authority A. DomainKeyGeneration(GP, MSK_A, AP_A, ID_domain = ⟨id_1, ..., id_ℓ⟩, h′) → DSK...

  11. 32 CFR 644.24 - Acquisition by Transfer from other Government Departments or Agencies (except Public Domain).

    Science.gov (United States)

    2010-07-01

    ... Departments or Agencies (except Public Domain). 644.24 Section 644.24 National Defense Department of Defense... Departments or Agencies (except Public Domain). When a requirement develops for the acquisition of Government... of existing improvements, the estimated cost of the proposed construction, attitude of the...

  12. Towards development of a high quality public domain global roads database

    Directory of Open Access Journals (Sweden)

    Andrew Nelson

    2006-12-01

    There is clear demand for a global spatial public domain roads data set with improved geographic and temporal coverage, consistent coding of road types, and clear documentation of sources. The currently best available global public domain product covers only one-quarter to one-third of the existing road networks, and this varies considerably by region. Applications for such a data set span multiple sectors and would be particularly valuable for the international economic development, disaster relief, and biodiversity conservation communities, not to mention national and regional agencies and organizations around the world. The building blocks for such a global product are available for many countries and regions, yet thus far there has been neither strategy nor leadership for developing it. This paper evaluates the best available public domain and commercial data sets, assesses the gaps in global coverage, and proposes a number of strategies for filling them. It also identifies stakeholder organizations with an interest in such a data set that might either provide leadership or funding for its development. It closes with a proposed set of actions to begin the process.

  13. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project (OCP, http://www.opencompute.org/) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal of developing servers and data centres following the model traditionally associated with open-source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large-scale installation. One objective is to evaluate whether the OCP market is sufficiently mature and broad to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  14. Time-domain CFD computation and analysis of acoustic attenuation performance of water-filled silencers

    Institute of Scientific and Technical Information of China (English)

    刘晨; 季振林; 程垠钟; 刘胜兰

    2016-01-01

    The multi-dimensional time-domain computational fluid dynamics (CFD) approach is extended to calculate the acoustic attenuation performance of water-filled piping silencers. Transmission loss predictions from the time-domain CFD approach and the frequency-domain finite element method (FEM) agree well with each other for the dual expansion chamber silencer, straight-through and cross-flow perforated tube silencers without flow. Then, the time-domain CFD approach is used to investigate the effect of flow on the acoustic attenuation characteristics of perforated tube silencers. The numerical predictions demonstrate that the mean flow increases the transmission loss, especially at higher frequencies, and shifts the transmission loss curve to lower frequencies.

  15. Reducing Dataset Size in Frequency Domain for Brain Computer Interface Motor Imagery Classification

    Directory of Open Access Journals (Sweden)

    Ch.Aparna

    2010-12-01

    Brain-computer interface (BCI) is an emerging area of research in which a BCI system detects and interprets mental activity as computer-interpretable signals, opening a wide area of applications where activities can be completed without muscular movement. In BCI research, the raw EEG signals captured have to undergo preprocessing to obtain the right attributes for classification. In this paper, we present a system that classifies mental tasks based on statistical data obtained in the frequency domain using the discrete cosine transform, extracting useful frequencies from it and applying decision tree algorithms for classification.
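
The processing chain this record describes, a DCT into the frequency domain followed by keeping a few coefficients as classifier features, can be sketched as follows. This is a pure-Python stand-in: the transform definition is the standard DCT-II, but the feature count and signals are invented, and the paper's actual pipeline is not reproduced.

```python
# DCT-II feature extraction sketch: map a raw signal into frequency
# coefficients and keep only the first few as features.
import math

def dct2(x):
    """Type-II discrete cosine transform (unnormalized)."""
    n = len(x)
    return [sum(x[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i in range(n)) for k in range(n)]

def features(signal, keep=4):
    """Dimensionality reduction: keep the first few DCT coefficients."""
    return dct2(signal)[:keep]

# A constant signal concentrates all energy in the k = 0 coefficient,
# which is why low-frequency DCT coefficients are compact features.
coeffs = dct2([1.0, 1.0, 1.0, 1.0])
```

In the paper these reduced feature vectors would then feed a decision tree; any standard tree learner can consume the `features(...)` output.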

  16. Teacher Perspectives on the Current State of Computer Technology Integration into the Public School Classroom

    Science.gov (United States)

    Zuniga, Ramiro

    2009-01-01

    Since the introduction of computers into the public school arena over forty years ago, educators have been convinced that the integration of computer technology into the public school classroom will transform education. Joining educators are state and federal governments. Public schools and others involved in the process of computer technology…

  17. User interfaces for computational science: a domain specific language for OOMMF embedded in Python

    CERN Document Server

    Beg, Marijan; Fangohr, Hans

    2016-01-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend ...
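
Approach (iv), embedding the simulation specification in an existing language, can be illustrated with a toy Python-embedded specification. The class and method names below are invented for this sketch; they are not OOMMF's or the authors' actual interface.

```python
# Toy embedded-DSL sketch: a simulation run is configured with ordinary
# Python objects rather than config files or recompilation.

class Mesh:
    def __init__(self, cells):
        self.cells = cells  # (nx, ny, nz) discretization

class Simulation:
    def __init__(self, name, mesh):
        self.name, self.mesh = name, mesh
        self.energies = []

    def add_energy(self, term):
        self.energies.append(term)
        return self  # allow chained specification

    def describe(self):
        return (f"{self.name}: {self.mesh.cells} cells, "
                f"energies={self.energies}")

sim = Simulation("demo", Mesh((10, 10, 1)))
sim.add_energy("exchange").add_energy("demag")
```

Because the specification is ordinary Python, users get loops, functions and version control for free, which is the reproducibility argument the abstract makes.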

  18. Towards Domain Ontology Creation Based on a Taxonomy Structure in Computer Vision

    Directory of Open Access Journals (Sweden)

    Sadgal mohamed

    2016-02-01

    In computer vision, creating a knowledge base usable by information systems requires a data structure that facilitates access to the information. The artificial intelligence community uses ontologies to structure and represent domain knowledge. This information structure can serve as a database for many geographic information systems (GIS) or information systems treating real objects, for example road scenes, and it can also be utilized by other systems. To this end, we provide a process to create a taxonomy structure based on a new hierarchical image clustering method. The hierarchical relation is based on visual object features and contributes to building a domain ontology.

  19. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    Science.gov (United States)

    Lewis, Colleen Marie

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and therefore cannot be successful learning to program. In contrast, I hypothesize that the degree to which computer science students make productive use of their out-of-domain knowledge can better explain the range of success of novices learning to program. To investigate what non-programming knowledge supports students' success, I conducted and videotaped approximately 40 hours of clinical interviews with 30 undergraduate students enrolled in introductory programming courses. During each interview, a participant talked as they solved programming problems, many of which were multiple-choice problems that were highly correlated with success on an Advanced Placement Computer Science exam. In the analysis of the interviews I focused on students' strengths rather than the typical decision to focus on students' weaknesses. I documented specific competencies of the participants and applied analytic tools from the Knowledge in Pieces theoretical framework (diSessa, 1993) to attempt to understand the source and nature of these competencies. I found that participants appeared to build upon several kinds of out-of-domain knowledge. For example, many students used algebraic substitution techniques when tracing the state of recursive functions. Students appeared to use metaphors and their intuitive knowledge of both iteration and physics to understand infinite loops and base cases. On the level of an individual students' reasoning, a case study analysis illustrated the ways in which a participant integrated her linguistic knowledge of "and" into her reasoning about the computer science command "and." In addition to identifying these specific

  20. Syntax without language: neurobiological evidence for cross-domain syntactic computations.

    Science.gov (United States)

    Tettamanti, Marco; Rotondi, Irene; Perani, Daniela; Scotti, Giuseppe; Fazio, Ferruccio; Cappa, Stefano F; Moro, Andrea

    2009-01-01

    Not all conceivable grammars are realized within human languages. Rules based on rigid distances, in which a certain word must occur at a fixed distance from another word, are never found in grammars of human languages. Distances between words are specified in terms of relative, non-rigid positions. The left inferior frontal gyrus (IFG) (Broca's area) has been found to be involved in the computation of non-rigid but not of rigid syntax in the language domain. A fundamental question is therefore whether the neural activity underlying this non-rigid architecture is language-specific, given that analogous structural properties can be found in other cognitive domains. Using event-related functional magnetic resonance imaging (fMRI) in sixteen healthy native speakers of Italian, we measured brain activity for the acquisition of rigid and non-rigid syntax in the visuo-spatial domain. The data of the present experiment were formally compared with those of a previous experiment, in which there was a symmetrical distinction between rigid and non-rigid syntax in the language domain. Both in the visuo-spatial and in the language domain, the acquisition of non-rigid syntax, but not the acquisition of rigid syntax, activated Brodmann Area 44 of the left IFG. This domain-independent effect was specifically modulated by performance improvement. Thus, in the human brain, one single "grammar without words" serves different higher cognitive functions.

  1. Adaptive, multi-domain techniques for two-phase flow computations

    Science.gov (United States)

    Uzgoren, Eray

    Computations of immiscible two-phase flows deal with interfaces that may move and/or deform in response to the dynamics within the flow field. As interfaces move, one needs to compute the new shapes and the associated geometric information (such as curvatures, normals, and projected areas/volumes) as part of the solution. The present study employs the immersed boundary method (IBM), which uses marker points to track the interface location, and continuous interface methods to model interfacial conditions. The large transport-property jumps across the interface, together with the mechanisms of convection, diffusion, pressure, body force and surface tension, create multiple time/length scales. The resulting computational stiffness and moving boundaries make numerical simulations computationally expensive in three dimensions, even when the computations are performed on adaptively refined 3D Cartesian grids that efficiently resolve the length scales. A domain decomposition method and a partitioning strategy for adaptively refined grids are developed to enable parallel computing capabilities. Specifically, the approach consists of a multilevel additive Schwarz method for domain decomposition, and Hilbert space-filling-curve ordering for partitioning. The issues related to load balancing, communication and computation, the convergence rate of the iterative solver with respect to grid size, the number of sub-domains, and interface shape deformation are studied. Moreover, the interfacial representation using marker points is extended to model complex solid geometries for single- and two-phase flows. The developed model is validated using a benchmark test case, flow over a cylinder. Furthermore, the overall algorithm is employed to further investigate the steady and unsteady behavior of the liquid plug problem.
    Finally, the capability of handling two-phase flow simulations in complex solid geometries is demonstrated by studying the effect of the bifurcation point on the liquid plug, which

  2. Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density

    Science.gov (United States)

    Hohl, A.; Delmelle, E. M.; Tang, W.

    2015-07-01

    Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data points that are within the spatial and temporal kernel bandwidths. Then, we quantify the computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of kernel density reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
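
The buffered decomposition described here can be illustrated with a minimal example: split the domain along time, give each subdomain a buffer one temporal bandwidth wide, and verify that per-point neighbor counts (a stand-in for kernel density) match the sequential result exactly. The 1-D time split below replaces the paper's octree, and all names and data are invented.

```python
# Buffered spatiotemporal decomposition sketch: counts within spatial
# bandwidth hs and temporal bandwidth ht must match the sequential result
# because each subdomain carries a buffer of width ht past its boundary.

def neighbors(points, p, hs, ht):
    """Points within spatial bandwidth hs and temporal bandwidth ht of p."""
    x, y, t = p
    return sum(1 for (a, b, c) in points
               if (a - x) ** 2 + (b - y) ** 2 <= hs ** 2
               and abs(c - t) <= ht)

def decomposed_counts(points, cut, hs, ht):
    """Split at time `cut` into two subdomains with temporal buffers."""
    left = [p for p in points if p[2] < cut + ht]    # own points + buffer
    right = [p for p in points if p[2] >= cut - ht]  # own points + buffer
    counts = {}
    for p in points:
        sub = left if p[2] < cut else right
        counts[p] = neighbors(sub, p, hs, ht)
    return counts

pts = [(0, 0, 1), (1, 0, 2), (0, 1, 3), (2, 2, 5), (2, 3, 6)]
seq = {p: neighbors(pts, p, 1.5, 2) for p in pts}
par = decomposed_counts(pts, 4, 1.5, 2)
```

The equality of `par` and `seq` is the edge-effect guarantee the abstract mentions; the paper additionally balances the per-subdomain workload before assigning subdomains to processors.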

  3. Daas: A Web-based System for User-specific Dietary Analysis and Advice for the Public Healthcare Domain

    Institute of Scientific and Technical Information of China (English)

    Deirdre Nugent; Kudakwashe Dube; Wu Bing

    2003-01-01

    This paper presents a Dietary Analysis and Advice System (DAAS), a web-based system for providing, within the public healthcare domain, user-specific diet advice based on a preliminary analysis of current diet or eating habits and lifestyle, using knowledge from domain expertise and experts' interpretation of national dietary guidelines.

  4. Open access high throughput drug discovery in the public domain: a Mount Everest in the making.

    Science.gov (United States)

    Roy, Anuradha; McDonald, Peter R; Sittampalam, Sitta; Chaguturu, Rathnam

    2010-11-01

    High throughput screening (HTS) facilitates screening large numbers of compounds against a biochemical target of interest using validated biological or biophysical assays. In recent years, a significant number of drugs in clinical trials originated from HTS campaigns, validating HTS as a bona fide mechanism for hit finding. In the current drug discovery landscape, the pharmaceutical industry is embracing open innovation strategies with academia to maximize its research capabilities and to feed its drug discovery pipeline. The goals of academic research have therefore expanded from target identification and validation to probe discovery, chemical genomics, and compound library screening. This trend is reflected in the emergence of HTS centers in the public domain over the past decade, ranging in size from modestly equipped academic screening centers to well-endowed Molecular Libraries Probe Centers Network (MLPCN) centers funded by the NIH Roadmap initiative. These centers facilitate a comprehensive approach to probe discovery in academia and utilize both classical and cutting-edge assay technologies for executing primary and secondary screening campaigns. The various facets of academic HTS centers as well as their implications for technology transfer and drug discovery are discussed, and a roadmap for successful drug discovery in the public domain is presented. New lead discovery against therapeutic targets, especially those involving rare and neglected diseases, is indeed a Mount Everest-sized task, and requires diligent implementation of the pharmaceutical industry's best practices for a successful outcome.

  5. Memristor standard cellular neural networks computing in the flux-charge domain.

    Science.gov (United States)

    Di Marco, Mauro; Forti, Mauro; Pancioni, Luca

    2017-09-01

    The paper introduces a class of memristor neural networks (NNs) that are characterized by the following salient features. (a) The processing of signals takes place in the flux-charge domain and is based on the time evolution of memristor charges. The processing result is given by the constant asymptotic values of charges that are stored in the memristors acting as non-volatile memories in steady state. (b) The dynamic equations describing the memristor NNs in the flux-charge domain are analogous to those describing, in the traditional voltage-current domain, the dynamics of a standard (S) cellular (C) NN, and are implemented by using a realistic model of memristors as that proposed by HP. This analogy makes it possible to use the bulk of results in the SCNN literature for designing memristor NNs to solve processing tasks in real time. Convergence of memristor NNs in the presence of multiple asymptotically stable equilibrium points is addressed and some applications to image processing tasks are presented to illustrate the real-time processing capabilities. Computing in the flux-charge domain is shown to have significant advantages with respect to computing in the voltage-current domain. One advantage is that, when a steady state is reached, currents, voltages and hence power in a memristor NN vanish, whereas memristors keep in memory the processing result. This is basically different from SCNNs for which currents, voltages and power do not vanish at a steady state, and batteries are needed to keep in memory the processing result. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  7. 32 CFR 310.52 - Computer matching publication and review requirements.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify...

  8. Computation of the acoustic radiation force using the finite-difference time-domain method.

    Science.gov (United States)

    Cai, Feiyan; Meng, Long; Jiang, Chunxiang; Pan, Yu; Zheng, Hairong

    2010-10-01

    The computational details of calculating the acoustic radiation force on an object using a 2-D grid finite-difference time-domain (FDTD) method are presented. The method is based on propagating the stress and velocity fields through the grid and determining the energy flow with and without the object. The axial and radial acoustic radiation forces predicted by the FDTD method are in excellent agreement with the results obtained by analytical evaluation of the scattering method. In particular, the results indicate that it is possible to trap the steel cylinder in the radial direction by optimizing the width of the Gaussian source and the operating frequency. Since the sizes of the objects involved are smaller than or comparable to the wavelength, the algorithm presented here can be easily extended to 3-D and can include torque computation algorithms, thus providing a highly flexible and universally usable computation engine.

  9. Tuning Time-Domain Pseudospectral Computations of the Self-Force on a Charged Scalar Particle

    CERN Document Server

    Canizares, Priscilla

    2011-01-01

    The computation of the self-force constitutes one of the main challenges for the construction of precise theoretical waveform templates in order to detect and analyze extreme-mass-ratio inspirals with the future space-based gravitational-wave observatory LISA. Since the number of templates required is quite high, it is important to develop fast algorithms both for the computation of the self-force and the production of waveforms. In this article we show how to tune a recent time-domain technique for the computation of the self-force, what we call the Particle without Particle scheme, in order to make it very precise and at the same time very efficient. We also extend this technique in order to allow for highly eccentric orbits.

  10. Tuning time-domain pseudospectral computations of the self-force on a charged scalar particle

    Energy Technology Data Exchange (ETDEWEB)

Canizares, Priscilla; Sopuerta, Carlos F., E-mail: pcm@ieec.uab.es, E-mail: sopuerta@ieec.uab.es [Facultat de Ciències, Institut de Ciències de l'Espai (CSIC-IEEC), Campus UAB, Torre C5 parells, Bellaterra, 08193 Barcelona (Spain)]

    2011-07-07

    The computation of the self-force constitutes one of the main challenges for the construction of precise theoretical waveform templates in order to detect and analyze extreme-mass-ratio inspirals with the future space-based gravitational-wave observatory LISA. Since the number of templates required is quite high, it is important to develop fast algorithms both for the computation of the self-force and the production of waveforms. In this paper, we show how to tune a recent time-domain technique for the computation of the self-force, what we call the particle without particle scheme, in order to make it very precise and at the same time very efficient. We also extend this technique in order to allow for highly eccentric orbits.

  11. Ethics, big data and computing in epidemiology and public health.

    Science.gov (United States)

    Salerno, Jennifer; Knoppers, Bartha M; Lee, Lisa M; Hlaing, WayWay M; Goodman, Kenneth W

    2017-05-01

    This article reflects on the activities of the Ethics Committee of the American College of Epidemiology (ACE). Members of the Ethics Committee identified an opportunity to elaborate on knowledge gained since the inception of the original Ethics Guidelines published by the ACE Ethics and Standards of Practice Committee in 2000. The ACE Ethics Committee presented a symposium session at the 2016 Epidemiology Congress of the Americas in Miami on the evolving complexities of ethics and epidemiology as it pertains to "big data." This article presents a summary and further discussion of that symposium session. Three topic areas were presented: the policy implications of big data and computing, the fallacy of "secondary" data sources, and the duty of citizens to contribute to big data. A balanced perspective is needed that provides safeguards for individuals but also furthers research to improve population health. Our in-depth review offers next steps for teaching of ethics and epidemiology, as well as for epidemiological research, public health practice, and health policy. To address contemporary topics in the area of ethics and epidemiology, the Ethics Committee hosted a symposium session on the timely topic of big data. Technological advancements in clinical medicine and genetic epidemiology research coupled with rapid advancements in data networks, storage, and computation at a lower cost are resulting in the growth of huge data repositories. Big data increases concerns about data integrity; informed consent; protection of individual privacy, confidentiality, and harm; data reidentification; and the reporting of faulty inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. The Creative Commons. A Third Way between Public Domain and Community ?

    Directory of Open Access Journals (Sweden)

    Clément Bert-Erboul

    2016-04-01

How have the Creative Commons ownership rules, used by free websites like Wikipedia or Flickr, in 3D printer projects, and in alternative kitchen gardens, been developed? Internet users and technological experimentation communities rely heavily on these free tools, but the ideologies of the public domain and online communities that allowed their birth often remain obscure. In this article we used American legal doctrine, the scientific literature and specialized press archives. From these sources we analyzed the links between copyright reforms and the institutionalization of the activity of free software developers, which lies at the origin of Creative Commons licenses. The case of intangible-goods property applied to tangible goods shows how community members and institutions legitimize their IT practices by means of several producers of norms, such as states or communities.

  13. Systematic analysis of public domain compound potency data identifies selective molecular scaffolds across druggable target families.

    Science.gov (United States)

    Hu, Ye; Wassermann, Anne Mai; Lounkine, Eugen; Bajorath, Jürgen

    2010-01-28

    Molecular scaffolds that yield target family-selective compounds are of high interest in pharmaceutical research. There continues to be considerable debate in the field as to whether chemotypes with a priori selectivity for given target families and/or targets exist and how they might be identified. What do currently available data tell us? We present a systematic and comprehensive selectivity-centric analysis of public domain target-ligand interactions. More than 200 molecular scaffolds are identified in currently available active compounds that are selective for established target families. A subset of these scaffolds is found to produce compounds with high selectivity for individual targets among closely related ones. These scaffolds are currently underrepresented in approved drugs.

  14. The Research Foci of Computing Research in South Africa as Reflected by Publications in the South African Computer Journal

    Directory of Open Access Journals (Sweden)

    Paula Kotze

    2010-07-01

The South African Computer Journal, better known as SACJ, has, for the last nineteen years, been one of the most pertinent publications for the computing discipline within the South African milieu. In this paper we reflect on the topics of research articles published in SACJ over its first 40 volumes, using the ACM Computing Classification Scheme as a basis. In our analysis we divided the publications into three cycles of roughly six years each in order to identify significant trends over the history of the journal. We also used the same classification scheme to analyse the publication trends of various South African tertiary education and research institutions.

  15. Generalized computer-aided discrete time domain modeling and analysis of dc-dc converters

    Science.gov (United States)

    Lee, F. C.; Iwens, R. P.; Yu, Y.; Triner, J. E.

    1977-01-01

    A generalized discrete time domain modeling and analysis technique is presented for all types of switching regulators using any type of duty-cycle controller, and operating in both continuous and discontinuous inductor current. State space techniques are employed to derive an equivalent nonlinear discrete time model that describes the converter exactly. The system is linearized about its equilibrium state to obtain a linear discrete time model for small signal performance evaluations, such as stability, audiosusceptibility and transient response. The analysis makes extensive use of the digital computer as an analytical tool. It is universal, exact and easy to use.
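The linearization about an equilibrium state described in this record can be illustrated generically: given a nonlinear discrete-time map x[k+1] = f(x[k]), locate the equilibrium and build the small-signal model numerically, then check discrete-time stability via the Jacobian's eigenvalues. The map below is a toy stand-in, not an actual converter model:

```python
import numpy as np

def linearize_discrete(f, x0, eps=1e-6, iters=100):
    """Find an equilibrium x* of x[k+1] = f(x[k]) by damped fixed-point
    iteration, then build the small-signal Jacobian A = df/dx|x* by
    central finite differences."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):                 # crude equilibrium search
        x = 0.5 * (x + f(x))
    n = x.size
    A = np.zeros((n, n))
    for j in range(n):
        d = np.zeros(n); d[j] = eps
        A[:, j] = (f(x + d) - f(x - d)) / (2 * eps)
    return x, A

# toy stand-in for a converter's exact nonlinear discrete map (not a real design)
f = lambda x: np.array([0.9 * x[0] + 0.1 * x[1] ** 2,
                        0.2 * x[0] + 0.5 * x[1]])
xeq, A = linearize_discrete(f, [0.1, 0.1])
# discrete-time small-signal stability: all eigenvalues inside the unit circle
stable = np.all(np.abs(np.linalg.eigvals(A)) < 1)
print(xeq, stable)
```

In the abstract's setting the same linear model would then be used for audiosusceptibility and transient-response evaluations.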

  16. Domain-general neural computations underlying prosociality during infancy and early childhood.

    Science.gov (United States)

    Cowell, Jason M; Calma-Birling, Destany; Decety, Jean

    2017-08-12

A mounting body of neuroscience research on the social and moral evaluative abilities of infants and young children suggests the co-opting of three domain-general processes involved in attention allocation, approach/avoidance, and intention and action understanding. Electrophysiological investigations demonstrate that children prefer prosocial others, that children's individual differences in moral evaluation predict prosocial behaviors, and that parental values may already influence neural sociomoral computations at quite young ages. This review highlights the importance of a developmental neuroscience approach in clarifying our understanding of early prosocial preference and behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Computational helioseismology in the frequency domain: acoustic waves in axisymmetric solar models with flows

    Science.gov (United States)

    Gizon, Laurent; Barucq, Hélène; Duruflé, Marc; Hanson, Chris S.; Leguèbe, Michael; Birch, Aaron C.; Chabassier, Juliette; Fournier, Damien; Hohage, Thorsten; Papini, Emanuele

    2017-03-01

    Context. Local helioseismology has so far relied on semi-analytical methods to compute the spatial sensitivity of wave travel times to perturbations in the solar interior. These methods are cumbersome and lack flexibility. Aims: Here we propose a convenient framework for numerically solving the forward problem of time-distance helioseismology in the frequency domain. The fundamental quantity to be computed is the cross-covariance of the seismic wavefield. Methods: We choose sources of wave excitation that enable us to relate the cross-covariance of the oscillations to the Green's function in a straightforward manner. We illustrate the method by considering the 3D acoustic wave equation in an axisymmetric reference solar model, ignoring the effects of gravity on the waves. The symmetry of the background model around the rotation axis implies that the Green's function can be written as a sum of longitudinal Fourier modes, leading to a set of independent 2D problems. We use a high-order finite-element method to solve the 2D wave equation in frequency space. The computation is embarrassingly parallel, with each frequency and each azimuthal order solved independently on a computer cluster. Results: We compute travel-time sensitivity kernels in spherical geometry for flows, sound speed, and density perturbations under the first Born approximation. Convergence tests show that travel times can be computed with a numerical precision better than one millisecond, as required by the most precise travel-time measurements. Conclusions: The method presented here is computationally efficient and will be used to interpret travel-time measurements in order to infer, e.g., the large-scale meridional flow in the solar convection zone. It allows the implementation of (full-waveform) iterative inversions, whereby the axisymmetric background model is updated at each iteration.
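The "embarrassingly parallel" structure described above, each frequency and each azimuthal order solved independently, can be sketched as follows. On a cluster each (omega, m) pair would go to its own process or node; here a thread pool and a toy Helmholtz-like system stand in for the actual 2D solar-model solves (all matrices and constants are illustrative assumptions):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product
import numpy as np

def solve_mode(omega, m, n=50):
    """Stand-in for one independent 2-D solve: each (frequency omega,
    azimuthal order m) pair yields its own linear system A x = b."""
    # toy Helmholtz-like operator (illustrative only, not the solar model)
    lap = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    A = lap - (omega**2 - 0.01 * m**2) * np.eye(n)
    return np.linalg.solve(A, np.ones(n))

modes = list(product([3.0, 3.5, 4.0], range(3)))     # (omega, m) grid
with ThreadPoolExecutor() as pool:                   # per-mode dispatch
    sols = list(pool.map(lambda om_m: solve_mode(*om_m), modes))
print(len(sols))
```

Because no mode couples to any other, scaling this dispatch out to a cluster requires no communication beyond collecting the per-mode Green's functions.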

  18. Final Report, DE-FG01-06ER25718 Domain Decomposition and Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Widlund, Olof B. [New York Univ. (NYU), NY (United States). Courant Inst.

    2015-06-09

The goal of this project is to develop and improve domain decomposition algorithms for a variety of partial differential equations such as those of linear elasticity and electromagnetics. These iterative methods are designed for massively parallel computing systems and allow the fast solution of the very large systems of algebraic equations that arise in large-scale, complicated simulations. Special emphasis is placed on problems arising from Maxwell's equations. The approximate solvers, the preconditioners, are combined with the conjugate gradient method and must always include a solver for a coarse model in order to achieve performance independent of the number of processors used in the simulation. A recent development allows for an adaptive construction of this coarse component of the preconditioner.
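A minimal sketch of the kind of preconditioned conjugate gradient iteration described here, using a one-level block-Jacobi (additive Schwarz) preconditioner on a 1-D Poisson matrix. The coarse-model component the report emphasizes is omitted for brevity; without it, iteration counts grow with the number of subdomains:

```python
import numpy as np

def cg(A, b, M_inv, tol=1e-8, maxit=500):
    """Preconditioned conjugate gradients: solve A x = b, applying the
    preconditioner via the callable M_inv."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it + 1
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# 1-D Poisson matrix and a block-Jacobi (one-level additive Schwarz)
# preconditioner over non-overlapping subdomains
n, nblocks = 128, 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
bounds = np.linspace(0, n, nblocks + 1).astype(int)
blocks = [np.linalg.inv(A[i:j, i:j]) for i, j in zip(bounds[:-1], bounds[1:])]

def M_inv(r):
    z = np.zeros_like(r)
    for (i, j), Binv in zip(zip(bounds[:-1], bounds[1:]), blocks):
        z[i:j] = Binv @ r[i:j]          # independent local subdomain solves
    return z

b = np.ones(n)
x, iters = cg(A, b, M_inv)
print(iters, np.linalg.norm(A @ x - b))
```

The local solves inside `M_inv` are exactly the part that is distributed across processors in a parallel implementation; the coarse solve would be added as an extra term in `M_inv`.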

  19. Computational Complexity Reduction of Synthetic-aperture Focus in Ultrasound Imaging Using Frequency-domain Reconstruction.

    Science.gov (United States)

    Moghimirad, Elahe; Mahloojifar, Ali; Mohammadzadeh Asl, Babak

    2016-05-01

A new frequency-domain implementation of a synthetic aperture focusing technique is presented in this paper. The concept, based on synthetic aperture radar (SAR) and sonar, is a developed version of the convolution model in the frequency domain. Compared with conventional line-by-line imaging, synthetic aperture imaging offers better resolution and contrast at the cost of a heavier computational load. To overcome this problem, point-by-point reconstruction methods have been replaced by block-processing algorithms in radar and sonar; however, these techniques are relatively unknown in medical imaging. In this paper, we extended one of these methods, called the wavenumber algorithm, to medical ultrasound imaging using a simple model of synthetic aperture focus. The model, derived here for monostatic mode, can be generalized to multistatic mode as well. The method consists of four steps: a 2D fast Fourier transform of the data, a frequency shift of the data to baseband, interpolation to convert polar coordinates to rectangular ones, and a return of the data to the spatial domain using a 2D inverse Fourier transform. We have also used chirp-pulse excitation followed by matched filtering and a spotlighting algorithm to compensate for the differences in parameters between radar and medical imaging. The computational complexities of the two methods, wavenumber and delay-and-sum (DAS), have been calculated. Field II simulated point data have been used to evaluate the results in terms of resolution and contrast. Evaluations with simulated data show that for typical phantoms, reconstruction by the wavenumber algorithm is almost 20 times faster than classical DAS while retaining the resolution. © The Author(s) 2015.
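The four steps listed in the record can be sketched structurally as follows. This is an illustrative monostatic Stolt-style skeleton with assumed constants; the baseband shift and matched filtering of the actual method are omitted for brevity:

```python
import numpy as np

def wavenumber_reconstruct(data, dt, dx, c=1540.0):
    """Skeleton of the reconstruction: (1) 2-D FFT, (2) baseband shift
    (omitted here), (3) Stolt-style interpolation from temporal frequency
    to depth wavenumber for each lateral wavenumber, (4) inverse 2-D FFT."""
    nt, nx = data.shape
    D = np.fft.fftshift(np.fft.fft2(data))            # step 1: 2-D FFT
    f = np.fft.fftshift(np.fft.fftfreq(nt, dt))       # temporal frequencies
    kx = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(nx, dx))
    kz = 2 * np.pi * f / (c / 2)                      # target depth-wavenumber grid
    mapped = np.zeros_like(D)
    for j in range(nx):                               # step 3: per-kx interpolation
        # frequency that maps onto each kz for this kx (monostatic Stolt relation)
        fmap = (c / 2) / (2 * np.pi) * np.sign(kz) * np.sqrt(kz**2 + kx[j]**2)
        mapped[:, j] = (np.interp(fmap, f, D[:, j].real, left=0, right=0)
                        + 1j * np.interp(fmap, f, D[:, j].imag, left=0, right=0))
    return np.fft.ifft2(np.fft.ifftshift(mapped)).real  # step 4: inverse 2-D FFT

rf = np.random.default_rng(0).standard_normal((256, 64))
image = wavenumber_reconstruct(rf, dt=1 / 40e6, dx=0.3e-3)
print(image.shape)
```

The speed advantage over delay-and-sum comes from replacing per-pixel delay sums with two FFTs plus a 1-D interpolation per column.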

  20. Public policy and regulatory implications for the implementation of Opportunistic Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning; Henten, Anders

    2012-01-01

    Opportunistic Cloud Computing Services (OCCS) is a social network approach to the provisioning and management of cloud computing services for enterprises. This paper discusses how public policy and regulations will impact on OCCS implementation. We rely on documented publicly available government...

  1. Research foci of computing research in South Africa as reflected by publications in the South African computer journal

    CSIR Research Space (South Africa)

    Kotzé, P

    2009-01-01

The South African Computer Journal, better known as SACJ, has, for the last nineteen years, been one of the most pertinent publications for the computing discipline within the South African milieu. In this paper the authors reflect on the topics...

  2. A method for improving the computational efficiency of a Laplace-Fourier domain waveform inversion based on depth estimation

    Science.gov (United States)

    Zhang, Dong; Zhang, Xiaolei; Yuan, Jianzheng; Ke, Rui; Yang, Yan; Hu, Ying

    2016-01-01

The Laplace-Fourier domain full waveform inversion can simultaneously restore both the long- and intermediate-to-short-wavelength information of velocity models because of its unique characteristics of complex frequencies. This approach solves the problem of conventional frequency-domain waveform inversion, in which the inversion result is excessively dependent on the initial model due to the lack of low-frequency information in seismic data. Nevertheless, the Laplace-Fourier domain waveform inversion requires substantial computational resources and long computation time because the inversion must be implemented on different combinations of multiple damping constants and multiple frequencies, namely, the complex frequencies, which are much more numerous than the Fourier frequencies. However, if the entire target model is computed on every complex frequency for the Laplace-Fourier domain inversion (as in the conventional frequency domain inversion), excessively redundant computation will occur. In the Laplace-Fourier domain waveform inversion, the maximum depth penetrated by the seismic wave decreases greatly due to the application of exponential damping to the seismic record, especially with use of a larger damping constant. Thus, the depth of the area effectively inverted on a complex frequency tends to be much less than the model depth. In this paper, we propose a method for quantitative estimation of the effective inversion depth in the Laplace-Fourier domain inversion based on the principle of seismic wave propagation and mathematical analysis. According to the estimated effective inversion depth, we can invert and update only the model area above the effective depth for every complex frequency without loss of accuracy in the final inversion result. Thus, redundant computation is eliminated, and the efficiency of the Laplace-Fourier domain waveform inversion can be improved. The proposed method was tested in numerical experiments.
The experimental results show that
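The exponential damping that underlies the effective-depth argument can be shown directly: evaluating a trace at a complex frequency s = sigma + 2*pi*i*f amounts to damping late arrivals before the Fourier sum, so a larger damping constant suppresses energy arriving from greater depths. The trace below is synthetic and purely illustrative:

```python
import numpy as np

def laplace_fourier(trace, dt, sigma, freqs):
    """Transform of a trace at complex frequencies s = sigma + 2*pi*i*f:
    exponential damping exp(-sigma*t) applied before the Fourier sum."""
    t = np.arange(len(trace)) * dt
    damp = np.exp(-sigma * t)
    return np.array([np.sum(trace * damp * np.exp(-2j * np.pi * f * t)) * dt
                     for f in freqs])

# synthetic record with a shallow (early) and a deep (late) arrival
dt = 0.004
t = np.arange(1000) * dt
trace = np.exp(-((t - 0.4) / 0.02) ** 2) + np.exp(-((t - 3.0) / 0.02) ** 2)

weak = laplace_fourier(trace, dt, sigma=0.5, freqs=[5.0])
strong = laplace_fourier(trace, dt, sigma=5.0, freqs=[5.0])
# the stronger damping constant suppresses the late (deep) arrival far more,
# which is why the effectively inverted depth shrinks as sigma grows
print(abs(weak[0]), abs(strong[0]))
```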

  3. Public domain small-area cancer incidence data for New York State, 2005-2009

    Directory of Open Access Journals (Sweden)

    Francis P. Boscoe

    2016-04-01

There has long been a demand for cancer incidence data at a fine geographic resolution for use in etiologic hypothesis generation and testing, methodological evaluation and teaching. In this paper we describe a public domain dataset containing data for 23 anatomic sites of cancer diagnosed in New York State, USA between 2005 and 2009 at the census block group level. The dataset includes 524,503 tumours distributed across 13,823 block groups with an average population of about 1400. In addition, the data have been linked with race/ethnicity and with socioeconomic indicators such as income, educational attainment and language proficiency. We demonstrate the application of the dataset by confirming two well-established relationships: that between breast cancer and median household income and that between stomach cancer and Asian race. We foresee that this dataset will serve as the basis for a wide range of spatial analyses and as a benchmark for evaluating spatial methods in the future.

  4. Preserving Madagascar's Natural Heritage: The Importance of Keeping the Island's Vertebrate Fossils in the Public Domain

    Directory of Open Access Journals (Sweden)

    Karen E. Samonds

    2006-12-01

The origin of Madagascar’s highly endemic vertebrate fauna remains one of the great unsolved mysteries of natural history. From what landmasses did the basal stocks of this unique and imbalanced fauna come? When and how did the ancestral populations arrive on the island? How rapidly did they diversify, and why? The most direct means of addressing these questions, and other enigmas concerning the evolutionary and biogeographic history of Madagascar’s vertebrate fauna, is through discovery of fossils from a sequence of well-dated geological horizons. Many fossils relevant to these queries have been discovered by paleontologists in recent years ... but many more are being lost to commercial enterprises, both foreign and domestic, that have little or no regard for the scientific significance of fossils. The objectives of this essay are to (1) provide an overview of Madagascar’s vertebrate fossil record and its importance, (2) raise awareness concerning the illegal collection, exportation, and sale of vertebrate fossils, and (3) stress the importance of keeping vertebrate fossils from the island in the public domain. In light of these issues, we underscore the necessity for development of adequate repositories and support infrastructure in Madagascar to safeguard and display the country’s vertebrate fossil collections; doing so would ensure the preservation and appreciation of Madagascar’s rich natural heritage for future generations of scientists and Malagasy citizens alike.

  5. A simple technique for morphological measurement of cerebral arterial circle variations using public domain software (Osiris).

    Science.gov (United States)

    Ansari, Saeed; Dadmehr, Majid; Eftekhar, Behzad; McConnell, Douglas J; Ganji, Sarah; Azari, Hassan; Kamali-Ardakani, Shahab; Hoh, Brian L; Mocco, J

    2011-12-01

    This article describes a straightforward method to measure the dimensions and identify morphological variations in the cerebral arterial circle using the general-purpose software program Osiris. This user-friendly and portable program displays, manipulates, and analyzes medical digital images, and it has the capability to determine morphometric properties of selected blood vessels (or other anatomical structures) in humans and animals. To ascertain morphometric variations in the cerebral arterial circle, 132 brains of recently deceased fetuses, infants, and adults were dissected. The dissection procedure was first digitized, and then the dimensions were measured with Osiris software. Measurements of each vessel's length and external diameters were used to identify and classify morphological variations in the cerebral arterial circle. The most commonly observed anatomical variations were uni- and bilateral hypoplasia of the posterior communicating artery. This study demonstrates that public domain software can be used to measure and classify cerebral arterial circle vessels. This method could be extended to examine other anatomical regions or to study other animals. Additionally, knowledge of variations within the circle could be applied clinically to enhance diagnostic and treatment specificity.

  6. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Science.gov (United States)

    2012-12-18

    ... National Institute of Standards and Technology Notice of Public Meeting--Cloud Computing and Big Data Forum...) announces a Cloud Computing and Big Data Forum and Workshop to be held on Tuesday, January 15, Wednesday... workshop. The NIST Cloud Computing and Big Data Forum and Workshop will bring together leaders and...

  7. 77 FR 26509 - Notice of Public Meeting-Cloud Computing Forum & Workshop V

    Science.gov (United States)

    2012-05-04

    ... National Institute of Standards and Technology Notice of Public Meeting--Cloud Computing Forum & Workshop V... announces the Cloud Computing Forum & Workshop V to be held on Tuesday, Wednesday and Thursday, June 5, 6... provide information on the U.S. Government (USG) Cloud Computing Technology Roadmap initiative. This...

  8. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Science.gov (United States)

    2011-10-07

    ... National Institute of Standards and Technology Notice of Public Meeting--Cloud Computing Forum & Workshop...: NIST announces the Cloud Computing Forum & Workshop IV to be held on November 2, 3 and 4, 2011. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology Roadmap...

  9. Prospects for Finite-Difference Time-Domain (FDTD) Computational Electrodynamics

    Science.gov (United States)

    Taflove, Allen

    2002-08-01

FDTD is the most powerful numerical solution of Maxwell's equations for structures having internal details. Relative to moment-method and finite-element techniques, FDTD can accurately model such problems with 100 times more field unknowns and with nonlinear and/or time-variable parameters. Hundreds of FDTD theory and applications papers are published each year. Currently, there are at least 18 commercial FDTD software packages for solving problems in: defense (especially vulnerability to electromagnetic pulse and high-power microwaves); design of antennas and microwave devices/circuits; electromagnetic compatibility; bioelectromagnetics (especially assessment of cellphone-generated RF absorption in human tissues); signal integrity in computer interconnects; and design of micro-photonic devices (especially photonic bandgap waveguides, microcavities, and lasers). This paper explores emerging prospects for FDTD computational electromagnetics brought about by continuing advances in computer capabilities and FDTD algorithms. We conclude that advances already in place point toward the usage by 2015 of ultralarge-scale (up to 1E11 field unknowns) FDTD electromagnetic wave models covering the frequency range from about 0.1 Hz to 1E17 Hz. We expect that this will yield significant benefits for our society in areas as diverse as computing, telecommunications, defense, and public health and safety.

  10. Integrating Publicly Available Data to Generate Computationally Predicted Adverse Outcome Pathways for Fatty Liver.

    Science.gov (United States)

    Bell, Shannon M; Angrish, Michelle M; Wood, Charles E; Edwards, Stephen W

    2016-04-01

New in vitro testing strategies make it possible to design testing batteries for large numbers of environmental chemicals. Full utilization of the results requires knowledge of the underlying biological networks and the adverse outcome pathways (AOPs) that describe the route from early molecular perturbations to an adverse outcome. Curation of a formal AOP is a time-intensive process and a rate-limiting step to designing these test batteries. Here, we describe a method for integrating publicly available data in order to generate computationally predicted AOP (cpAOP) scaffolds, which can be leveraged by domain experts to shorten the time for formal AOP development. A network-based workflow was used to facilitate the integration of multiple data types to generate cpAOPs. Edges between graph entities were identified through direct experimental or literature information, or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways as measured by differential gene expression and high-throughput screening targets. The resulting fatty liver cpAOPnet is available as a resource to the community. Subnetworks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (fatty liver) were compared with published mechanistic descriptions. In both cases, the computational approaches approximated the manually curated AOPs. The cpAOPnet can be used for accelerating expert-curated AOP development and to identify pathway targets that lack genomic markers or high-throughput screening tests. It can also facilitate identification of key events for designing test batteries and for classification and grouping of chemicals for follow-up testing.
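The frequent-itemset step used to infer edges can be illustrated at pair level with a few toy "transactions"; the example chemicals and phenotypes below are stand-ins, not TG-GATEs or ToxCast data:

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    """Minimal frequent-itemset step (pairs only): count co-occurrences
    over transactions and keep pairs with support >= min_support. In a
    cpAOP-style workflow each frequent pair suggests an inferred edge."""
    counts = Counter()
    for t in transactions:
        counts.update(combinations(sorted(set(t)), 2))
    n = len(transactions)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

# toy treatment records: chemical, observed phenotype, perturbed pathway
transactions = [
    {"CCl4", "steatosis", "lipid_metabolism"},
    {"CCl4", "steatosis", "oxidative_stress"},
    {"CCl4", "steatosis", "lipid_metabolism"},
    {"ethanol", "steatosis", "lipid_metabolism"},
]
edges = frequent_pairs(transactions, min_support=0.5)
print(edges)   # e.g. ('CCl4', 'steatosis') co-occurs in 3 of 4 records
```

Real itemset miners (Apriori, FP-growth) generalize this to larger itemsets and prune the candidate space, but the support computation is the same idea.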

  11. EPA National Center for Computational Toxicology UPDATE (ICCVAM public forum)

    Science.gov (United States)

    A presentation to the ICCVAM Public Forum on several new and exciting activities at NCCT, including Chemical library update, Chemistry Dashboard, Retrofitting in vitro assays with metabolic competence and In vitro PK.

  12. A 2D Time Domain DRBEM Computer Model for MagnetoThermoelastic Coupled Wave Propagation Problems

    Directory of Open Access Journals (Sweden)

    Mohamed Abdelsabour Fahmy

    2014-07-01

A numerical computer model based on the dual reciprocity boundary element method (DRBEM) is extended to study magneto-thermoelastic coupled wave propagation problems with relaxation times involving anisotropic functionally graded solids. The model formulation is tested through its application to the problem of a solid placed in a constant primary magnetic field acting in the direction of the z-axis and rotating about this axis with a constant angular velocity. In the case of two-dimensional deformation, an implicit-explicit time domain DRBEM was presented and implemented to obtain the solution for the displacement and temperature fields. A comparison of the results is presented graphically in the context of the Lord and Shulman (LS) and Green and Lindsay (GL) theories. Numerical results that demonstrate the validity of the proposed method are also presented graphically.

  13. Domain decomposition parallel computing for transient two-phase flow of nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of); Choi, Hyoung Gwon [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

KAERI (Korea Atomic Energy Research Institute) has been developing a multi-dimensional two-phase flow code named CUPID for multi-physics and multi-scale thermal hydraulics analysis of light water reactors (LWRs). The CUPID code has been validated against a set of conceptual problems and experimental data. In this work, the CUPID code has been parallelized based on the domain decomposition method with the Message Passing Interface (MPI) library. For domain decomposition, the CUPID code provides both manual and automatic methods, the latter using the METIS library. For effective memory management, the compressed sparse row (CSR) format is adopted, one of the standard ways to represent a sparse asymmetric matrix: it stores only the non-zero values and their positions (row and column). By verifying the code against the fundamental problem set, the parallelization of CUPID has been successfully confirmed. Since the scalability of a parallel simulation is generally known to be better for fine mesh systems, three different scales of mesh system are considered: 40,000 meshes for the coarse mesh system, 320,000 meshes for the mid-size mesh system, and 2,560,000 meshes for the fine mesh system. In the given geometry, both single- and two-phase calculations were conducted. In addition, two types of preconditioners for the matrix solver were compared: diagonal and incomplete LU. In terms of enhancing parallel performance, hybrid OpenMP and MPI parallel computing for the pressure solver was examined. It is revealed that the scalability of the hybrid calculation was enhanced for multi-core parallel computation.
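The CSR layout described in the record, storing only non-zero values plus their positions, can be sketched with a small matrix-vector product. A minimal illustration, not CUPID's implementation:

```python
import numpy as np

def dense_to_csr(A):
    """Build the three CSR arrays: non-zero values, their column indices,
    and row pointers delimiting each row's slice of the value array."""
    vals, cols, rowptr = [], [], [0]
    for row in A:
        for j, a in enumerate(row):
            if a != 0:
                vals.append(a)
                cols.append(j)
        rowptr.append(len(vals))
    return np.array(vals), np.array(cols), np.array(rowptr)

def csr_matvec(vals, cols, rowptr, x):
    """y = A @ x touching only the stored non-zeros."""
    y = np.zeros(len(rowptr) - 1)
    for i in range(len(y)):
        lo, hi = rowptr[i], rowptr[i + 1]
        y[i] = vals[lo:hi] @ x[cols[lo:hi]]
    return y

A = np.array([[4., 0., 1.],
              [0., 3., 0.],
              [2., 0., 5.]])
vals, cols, rowptr = dense_to_csr(A)
x = np.array([1., 2., 3.])
print(csr_matvec(vals, cols, rowptr, x))   # same result as A @ x
```

For the large asymmetric systems in the abstract, this layout cuts memory from O(n^2) to O(nnz) and makes the matrix-vector products inside Krylov solvers proportional to the non-zero count.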

  14. Computational design of a PDZ domain peptide inhibitor that rescues CFTR activity.

    Directory of Open Access Journals (Sweden)

    Kyle E Roberts

The cystic fibrosis transmembrane conductance regulator (CFTR) is an epithelial chloride channel mutated in patients with cystic fibrosis (CF). The most prevalent CFTR mutation, ΔF508, blocks folding in the endoplasmic reticulum. Recent work has shown that some ΔF508-CFTR channel activity can be recovered by pharmaceutical modulators ("potentiators" and "correctors"), but ΔF508-CFTR can still be rapidly degraded via a lysosomal pathway involving the CFTR-associated ligand (CAL), which binds CFTR via a PDZ interaction domain. We present a study that goes from theory, to new structure-based computational design algorithms, to computational predictions, to biochemical testing and ultimately to epithelial-cell validation of novel, effective CAL PDZ inhibitors (called "stabilizers") that rescue ΔF508-CFTR activity. To design the "stabilizers", we extended our structural ensemble-based computational protein redesign algorithm K* to encompass protein-protein and protein-peptide interactions. The computational predictions achieved high accuracy: all of the top-predicted peptide inhibitors bound well to CAL. Furthermore, when compared to state-of-the-art CAL inhibitors, our design methodology achieved higher affinity and increased binding efficiency. The designed inhibitor with the highest affinity for CAL (kCAL01) binds six-fold more tightly than the previous best hexamer (iCAL35), and 170-fold more tightly than the CFTR C-terminus. We show that kCAL01 has physiological activity and can rescue chloride efflux in CF patient-derived airway epithelial cells. Since stabilizers address a different cellular CF defect from potentiators and correctors, our inhibitors provide an additional therapeutic pathway that can be used in conjunction with current methods.

  15. Excellence in Computational Biology and Informatics — EDRN Public Portal

    Science.gov (United States)

    9th Early Detection Research Network (EDRN) Scientific Workshop. Excellence in Computational Biology and Informatics: Sponsored by the EDRN Data Sharing Subcommittee Moderator: Daniel Crichton, M.S., NASA Jet Propulsion Laboratory

  16. Proposed standards for peer-reviewed publication of computer code

    Science.gov (United States)

    Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...

  17. The public understanding of nanotechnology in the food domain: the hidden role of views on science, technology, and nature.

    Science.gov (United States)

    Vandermoere, Frederic; Blanchemanche, Sandrine; Bieberstein, Andrea; Marette, Stephan; Roosen, Jutta

    2011-03-01

    In spite of great expectations about the potential of nanotechnology, this study shows that people are rather ambiguous and pessimistic about nanotechnology applications in the food domain. Our findings are drawn from a survey of public perceptions about nanotechnology food and nanotechnology food packaging (N = 752). Multinomial logistic regression analyses further reveal that knowledge about food risks and nanotechnology significantly influences people's views about nanotechnology food packaging. However, knowledge variables were unrelated to support for nanofood, suggesting that an increase in people's knowledge might not be sufficient to bridge the gap between the excitement some business leaders in the food sector have and the restraint of the public. Additionally, opposition to nanofood was not related to the use of heuristics but to trust in governmental agencies. Furthermore, the results indicate that public perceptions of nanoscience in the food domain significantly relate to views on science, technology, and nature.

  18. The Explicit Computations of the Bergman Kernels on Generalized Hua Domains

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The Bergman kernel function plays an important role in several complex variables. There exists the Bergman kernel function on any bounded domain in C^n. But we can get the Bergman kernel functions in explicit formulas for a few types of domains only, for instance, the bounded homogeneous domains and the egg domains in some cases.

  19. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are "archivable", transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overheads, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  20. Computational helioseismology in the frequency domain: acoustic waves in axisymmetric solar models with flows

    CERN Document Server

    Gizon, Laurent; Duruflé, Marc; Hanson, Chris S; Leguèbe, Michael; Birch, Aaron C; Chabassier, Juliette; Fournier, Damien; Hohage, Thorsten; Papini, Emanuele

    2016-01-01

    Local helioseismology has so far relied on semi-analytical methods to compute the spatial sensitivity of wave travel times to perturbations in the solar interior. These methods are cumbersome and lack flexibility. Here we propose a convenient framework for numerically solving the forward problem of time-distance helioseismology in the frequency domain. The fundamental quantity to be computed is the cross-covariance of the seismic wavefield. We choose sources of wave excitation that enable us to relate the cross-covariance of the oscillations to the Green's function in a straightforward manner. We illustrate the method by considering the 3D acoustic wave equation in an axisymmetric reference solar model, ignoring the effects of gravity on the waves. The symmetry of the background model around the rotation axis implies that the Green's function can be written as a sum of longitudinal Fourier modes, leading to a set of independent 2D problems. We use a high-order finite-element method to solve the 2D wave equati...

  1. Exploring Symmetry as an Avenue to the Computational Design of Large Protein Domains

    Energy Technology Data Exchange (ETDEWEB)

    Fortenberry, Carie; Bowman, Elizabeth Anne; Proffitt, Will; Dorr, Brent; Combs, Steven; Harp, Joel; Mizoue, Laura; Meiler, Jens (Vanderbilt)

    2012-03-15

    It has been demonstrated previously that symmetric, homodimeric proteins are energetically favored, which explains their abundance in nature. It has been proposed that such symmetric homodimers underwent gene duplication and fusion to evolve into protein topologies that have a symmetric arrangement of secondary structure elements - 'symmetric superfolds'. Here, the ROSETTA protein design software was used to computationally engineer a perfectly symmetric variant of imidazole glycerol phosphate synthase and its corresponding symmetric homodimer. The new protein, termed FLR, adopts the symmetric (βα)₈ TIM-barrel superfold. The protein is soluble and monomeric and exhibits two-fold symmetry not only in the arrangement of secondary structure elements but also in sequence and at atomic detail, as verified by crystallography. When cut in half, FLR dimerizes readily to form the symmetric homodimer. The successful computational design of FLR demonstrates progress in our understanding of the underlying principles of protein stability and presents an attractive strategy for the in silico construction of larger protein domains from smaller pieces.

  2. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called 'simple' shapes. Using generalisations, the class of 'simple' shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. To move beyond these limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The corresponding software to be developed subsequently will be modularised and generalised so that it can be used in upcoming projects that require rapid development of computer based models.

  3. Communication and Computation Skills for Blind Students Attending Public Schools.

    Science.gov (United States)

    Suffolk County Board of Cooperative Educational Services 3, Dix Hills, NY.

    Outlined are evaluative and instructional procedures used by itinerant teachers of blind children in public schools to teach readiness for braille reading and writing, as well as braille reading and writing, signature writing, and the Nemeth Code of braille mathematics and scientific notation. Readiness for braille reading and writing is…

  4. Calculation method of reflectance distributions for computer-generated holograms using the finite-difference time-domain method.

    Science.gov (United States)

    Ichikawa, Tsubasa; Sakamoto, Yuji; Subagyo, Agus; Sueoka, Kazuhisa

    2011-12-01

    The research on reflectance distributions in computer-generated holograms (CGHs) is particularly sparse, and the textures of materials are not expressed. Thus, we propose a method for calculating reflectance distributions in CGHs that uses the finite-difference time-domain method. In this method, reflected light from an uneven surface made on a computer is analyzed by finite-difference time-domain simulation, and the reflected light distribution is applied to the CGH as an object light. We report the relations between the surface roughness of the objects and the reflectance distributions, and show that the reflectance distributions are given to CGHs by imaging simulation.
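
    The finite-difference time-domain method underlying the abstract can be illustrated with a minimal 1-D sketch (normalized units, hypothetical grid size and source; the paper's actual 3-D rough-surface scattering setup is far more involved). The leapfrog update alternates between the electric and magnetic fields:

```python
import numpy as np

# Minimal 1-D FDTD (Yee-style leapfrog) in normalized units with Courant
# number 0.5; illustrative only, not the paper's reflectance pipeline.
nx, nt = 200, 300
ez = np.zeros(nx)   # electric field
hy = np.zeros(nx)   # magnetic field
for t in range(nt):
    # update H from the spatial difference (curl) of E
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
    # update E from the spatial difference (curl) of H
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])
    # soft source: a Gaussian pulse injected at one grid point
    ez[50] += np.exp(-((t - 30) / 10.0) ** 2)
```

    In the paper's setting, a 2-D/3-D version of this update is run against a rough surface model, and the scattered field distribution is then used as the object light for the CGH.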

  5. Computers in Public Schools: Changing the Image with Image Processing.

    Science.gov (United States)

    Raphael, Jacqueline; Greenberg, Richard

    1995-01-01

    The kinds of educational technologies selected can make the difference between uninspired, rote computer use and challenging learning experiences. University of Arizona's Image Processing for Teaching Project has worked with over 1,000 teachers to develop image-processing techniques that provide students with exciting, open-ended opportunities for…

  6. The ACLS Survey of Scholars: Views on Publications, Computers, Libraries.

    Science.gov (United States)

    Morton, Herbert C.; Price, Anne Jamieson

    1986-01-01

    Reviews results of a survey by the American Council of Learned Societies (ACLS) of 3,835 scholars in the humanities and social sciences who are working both in colleges and universities and outside the academic community. Areas highlighted include professional reading, authorship patterns, computer use, and library use. (LRW)

  7. Moving domain computational fluid dynamics to interface with an embryonic model of cardiac morphogenesis.

    Directory of Open Access Journals (Sweden)

    Juhyun Lee

    Full Text Available Peristaltic contraction of the embryonic heart tube produces time- and spatial-varying wall shear stress (WSS) and pressure gradients (∇P) across the atrioventricular (AV) canal. Zebrafish (Danio rerio) are a genetically tractable system to investigate cardiac morphogenesis. The use of Tg(fli1a:EGFP) (y1) transgenic embryos allowed for delineation and two-dimensional reconstruction of the endocardium. This time-varying wall motion was then prescribed in a two-dimensional moving domain computational fluid dynamics (CFD) model, providing new insights into spatial and temporal variations in WSS and ∇P during cardiac development. The CFD simulations were validated with particle image velocimetry (PIV) across the atrioventricular (AV) canal, revealing an increase in both velocities and heart rates, but a decrease in the duration of atrial systole from early to later stages. At 20-30 hours post fertilization (hpf), simulation results revealed bidirectional WSS across the AV canal in the heart tube in response to peristaltic motion of the wall. At 40-50 hpf, the tube structure undergoes cardiac looping, accompanied by a nearly 3-fold increase in WSS magnitude. At 110-120 hpf, distinct AV valve, atrium, ventricle, and bulbus arteriosus form, accompanied by incremental increases in both WSS magnitude and ∇P, but a decrease in bi-directional flow. Laminar flow develops across the AV canal at 20-30 hpf, and persists at 110-120 hpf. Reynolds numbers at the AV canal increase from 0.07±0.03 at 20-30 hpf to 0.23±0.07 at 110-120 hpf (p < 0.05, n = 6), whereas Womersley numbers remain relatively unchanged from 0.11 to 0.13. Our moving domain simulations highlight hemodynamic changes in relation to cardiac morphogenesis, thereby providing a 2-D quantitative approach to complement imaging analysis.

  8. Moving domain computational fluid dynamics to interface with an embryonic model of cardiac morphogenesis.

    Science.gov (United States)

    Lee, Juhyun; Moghadam, Mahdi Esmaily; Kung, Ethan; Cao, Hung; Beebe, Tyler; Miller, Yury; Roman, Beth L; Lien, Ching-Ling; Chi, Neil C; Marsden, Alison L; Hsiai, Tzung K

    2013-01-01

    Peristaltic contraction of the embryonic heart tube produces time- and spatial-varying wall shear stress (WSS) and pressure gradients (∇P) across the atrioventricular (AV) canal. Zebrafish (Danio rerio) are a genetically tractable system to investigate cardiac morphogenesis. The use of Tg(fli1a:EGFP) (y1) transgenic embryos allowed for delineation and two-dimensional reconstruction of the endocardium. This time-varying wall motion was then prescribed in a two-dimensional moving domain computational fluid dynamics (CFD) model, providing new insights into spatial and temporal variations in WSS and ∇P during cardiac development. The CFD simulations were validated with particle image velocimetry (PIV) across the atrioventricular (AV) canal, revealing an increase in both velocities and heart rates, but a decrease in the duration of atrial systole from early to later stages. At 20-30 hours post fertilization (hpf), simulation results revealed bidirectional WSS across the AV canal in the heart tube in response to peristaltic motion of the wall. At 40-50 hpf, the tube structure undergoes cardiac looping, accompanied by a nearly 3-fold increase in WSS magnitude. At 110-120 hpf, distinct AV valve, atrium, ventricle, and bulbus arteriosus form, accompanied by incremental increases in both WSS magnitude and ∇P, but a decrease in bi-directional flow. Laminar flow develops across the AV canal at 20-30 hpf, and persists at 110-120 hpf. Reynolds numbers at the AV canal increase from 0.07±0.03 at 20-30 hpf to 0.23±0.07 at 110-120 hpf (p < 0.05, n = 6), whereas Womersley numbers remain relatively unchanged from 0.11 to 0.13. Our moving domain simulations highlight hemodynamic changes in relation to cardiac morphogenesis, thereby providing a 2-D quantitative approach to complement imaging analysis.
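
    The dimensionless numbers quoted above follow from the standard definitions Re = UL/ν and Womersley number α = L·sqrt(2πf/ν). A back-of-envelope sketch with assumed embryonic-scale values (not the paper's measured parameters) lands in the same sub-unity regime:

```python
import math

def reynolds(velocity, length, nu):
    """Re = U * L / nu, with kinematic viscosity nu."""
    return velocity * length / nu

def womersley(length, freq_hz, nu):
    """alpha = L * sqrt(2*pi*f / nu), the standard Womersley number."""
    return length * math.sqrt(2.0 * math.pi * freq_hz / nu)

# Assumed order-of-magnitude values for an embryonic zebrafish AV canal:
nu = 4.0e-6   # m^2/s, assumed kinematic viscosity of embryonic blood
L = 50e-6     # m, assumed AV canal diameter
U = 5e-3      # m/s, assumed peak velocity
f = 2.0       # Hz, assumed heart rate

re = reynolds(U, L, nu)      # ~0.06: viscous forces dominate (laminar)
alpha = womersley(L, f, nu)  # ~0.09: quasi-steady pulsatile flow
```

    Both numbers being well below 1 is what justifies the laminar, quasi-steady flow interpretation in the abstract.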

  9. Suppressing gate errors in frequency-domain quantum computation through extra physical systems coupled to a cavity

    Science.gov (United States)

    Nakamura, Satoshi; Goto, Hayato; Kujiraoka, Mamiko; Ichimura, Kouichi

    2016-12-01

    We propose a scheme for frequency-domain quantum computation (FDQC) in which the errors due to crosstalk are suppressed using extra physical systems coupled to a cavity. FDQC is a promising method to realize large-scale quantum computation, but crosstalk is a major problem. When physical systems employed as qubits satisfy specific resonance conditions, gate errors due to crosstalk increase. In our scheme, the errors are suppressed by controlling the resonance conditions using extra physical systems.

  10. The Use of Public Computing Facilities by Library Patrons: Demography, Motivations, and Barriers

    Science.gov (United States)

    DeMaagd, Kurt; Chew, Han Ei; Huang, Guanxiong; Khan, M. Laeeq; Sreenivasan, Akshaya; LaRose, Robert

    2013-01-01

    Public libraries play an important part in the development of a community. Today, they are seen as more than store houses of books; they are also responsible for the dissemination of online, and offline information. Public access computers are becoming increasingly popular as more and more people understand the need for internet access. Using a…

  11. On public domain in copyright law

    Institute of Scientific and Technical Information of China (English)

    黄汇

    2009-01-01

    Public domain is a core rule of copyright law, under which various creative materials are available for an author to use without charge or liability for infringement, hence ensuring the effective implementation of copyright law. Public domain is characterized by openness, public ownership, irrevocability and formality. Based on the premise that the author's work will not be interfered with, public domain ultimately aims at the enlargement of its own universe and the prosperity of the culture of human society. Its introduction into copyright law satisfies both historical and logical demands. Without its acknowledgement, copyright cannot be justified. In that sense, public domain and copyright can be deemed as twins. Public domain is not only an existing institution, but also an ideological tendency or a methodology. It has evaluative and inspective value towards copyright. It is an important precondition of copyright, and, what is more, an important measure for controlling the expansion and realizing the purpose of copyright.

  12. A Computer-Assisted Instruction in Teaching Abstract Statistics to Public Affairs Undergraduates

    Science.gov (United States)

    Ozturk, Ali Osman

    2012-01-01

    This article attempts to demonstrate the applicability of a computer-assisted instruction supported with simulated data in teaching abstract statistical concepts to political science and public affairs students in an introductory research methods course. The software is called the Elaboration Model Computer Exercise (EMCE) in that it takes a great…

  13. Effects of the computational domain size on DNS of Taylor-Couette turbulence

    CERN Document Server

    Mónico, Rodolfo Ostilla; Lohse, Detlef

    2014-01-01

    In search for the cheapest but still reliable numerical simulation, a systematic study on the effect of the computational domain ("box") size on direct numerical simulations of Taylor-Couette flow was performed. Four boxes, with varying azimuthal and axial extents, were used. The radius ratio between the inner cylinder and the outer cylinder was fixed at $\eta = r_i/r_o = 0.909$, and the outer cylinder was kept stationary, while the inner rotated at a Reynolds number $Re_i = 10^5$. Profiles of mean and fluctuation velocities are compared, as well as autocorrelations and velocity spectra. The smallest box is found to accurately reproduce the torque and mean azimuthal velocity profiles of larger boxes, while having smaller values of the fluctuations than the larger boxes. The axial extent of the box directly reflects on the Taylor rolls and plays a crucial role in the correlations and spectra. The azimuthal extent is also found to play a significant role, as larger boxes allow for azimuthal wave-like patterns in the Taylor rol...

  14. A rough-granular computing in discovery of process models from data and domain knowledge

    Institute of Scientific and Technical Information of China (English)

    NGUYEN Hung Son; SKOWRON Andrzej

    2008-01-01

    The rapid expansion of the Internet has resulted not only in the ever growing amount of data therein stored, but also in the burgeoning complexity of the concepts and phenomena pertaining to those data. This issue has been vividly compared by the renowned statistician, Prof. Friedman of Stanford University, to the advances in human mobility from the period of walking afoot to the era of jet travel. These essential changes in data have brought new challenges to the development of new data mining methods, especially as the treatment of these data increasingly involves complex processes that elude classic modeling paradigms. "Hot" datasets like biomedical, financial or net user behavior data are just a few examples. Mining such temporal or stream data is on the agenda of many research centers and companies worldwide. In the data mining community, there is a rapidly growing interest in developing methods for process mining, e.g., for discovery of structures of temporal processes from data. Works on process mining have recently been undertaken by many renowned centers worldwide. This research is also related to functional data analysis, cognitive networks, and dynamical system modeling, e.g., in biology. In the lecture, we outline an approach to discovery of processes from data and domain knowledge which is based on rough-granular computing.

  15. Time Is Not Space: Core Computations and Domain-Specific Networks for Mental Travels.

    Science.gov (United States)

    Gauthier, Baptiste; van Wassenhove, Virginie

    2016-11-23

    Humans can consciously project themselves in the future and imagine themselves at different places. Do mental time travel and mental space navigation abilities share common cognitive and neural mechanisms? To test this, we recorded fMRI while participants mentally projected themselves in time or in space (e.g., 9 years ago, in Paris) and ordered historical events from their mental perspective. Behavioral patterns were comparable for mental time and space and shaped by self-projection and by the distance of historical events to the mental position of the self, suggesting the existence of egocentric mapping in both dimensions. Nonetheless, self-projection in space engaged the medial and lateral parietal cortices, whereas self-projection in time engaged a widespread parietofrontal network. Moreover, while a large distributed network was found for spatial distances, temporal distances specifically engaged the right inferior parietal cortex and the anterior insula. Across these networks, a robust overlap was only found in a small region of the inferior parietal lobe, adding evidence for its role in domain-general egocentric mapping. Our findings suggest that mental travel in time or space capitalizes on egocentric remapping and on distance computation, which are implemented in distinct dimension-specific cortical networks converging in inferior parietal lobe.

  16. Neutron Stimulated Emission Computed Tomography (NSECT) image enhancement using a linear filter in the frequency domain

    Energy Technology Data Exchange (ETDEWEB)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jackowski, Marcel P., E-mail: mjack@ime.usp.b [University of Sao Paulo (USP), SP (Brazil). Dept. of Computer Science

    2011-07-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise in the reconstructed image as the number of iterations increases. This noise can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)
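
    A linear frequency-domain filter of the kind described can be sketched as follows (Python/NumPy, with an assumed Gaussian transfer function; the abstract does not specify the actual kernel used): transform the image, attenuate high spatial frequencies, and transform back.

```python
import numpy as np

def freq_lowpass(img, sigma):
    """Apply a Gaussian low-pass filter in the frequency domain (assumed kernel)."""
    F = np.fft.fftshift(np.fft.fft2(img))          # spectrum, DC at center
    ny, nx = img.shape
    y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
    H = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))  # Gaussian transfer function
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

# Toy "reconstruction": a bright square corrupted by iteration noise
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
noisy = img + 0.3 * rng.standard_normal(img.shape)
smooth = freq_lowpass(noisy, sigma=8.0)  # high-frequency noise suppressed
```

    The design trade-off is the usual one: a tighter `sigma` removes more E-M iteration noise but also blurs genuine edges in the reconstructed tissue image.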

  17. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public... of the Internet on NARA-supplied computers? (a) Public access computers (workstations) are available... use personally owned diskettes on NARA personal computers. You may not load files or any type of...

  18. A literature review of neck pain associated with computer use: public health implications

    OpenAIRE

    2008-01-01

    Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. This review of the literature identifies public health aspects of neck pain as associated with computer use. While some retrospective studies support the hypothesis that frequent computer operation is associated with neck pain, few prospective studies reveal causal relationships. Many risk factors are identified in the literature. Primary prevention strategies have largely been confi...

  19. Advances in time-domain electromagnetic simulation capabilities through the use of overset grids and massively parallel computing

    Science.gov (United States)

    Blake, Douglas Clifton

    A new methodology is presented for conducting numerical simulations of electromagnetic scattering and wave-propagation phenomena on massively parallel computing platforms. A process is constructed which is rooted in the Finite-Volume Time-Domain (FVTD) technique to create a simulation capability that is both versatile and practical. In terms of versatility, the method is platform independent, is easily modifiable, and is capable of solving a large number of problems with no alterations. In terms of practicality, the method is sophisticated enough to solve problems of engineering significance and is not limited to mere academic exercises. In order to achieve this capability, techniques are integrated from several scientific disciplines including computational fluid dynamics, computational electromagnetics, and parallel computing. The end result is the first FVTD solver capable of utilizing the highly flexible overset-gridding process in a distributed-memory computing environment. In the process of creating this capability, work is accomplished to conduct the first study designed to quantify the effects of domain-decomposition dimensionality on the parallel performance of hyperbolic partial differential equations solvers; to develop a new method of partitioning a computational domain comprised of overset grids; and to provide the first detailed assessment of the applicability of overset grids to the field of computational electromagnetics. Using these new methods and capabilities, results from a large number of wave propagation and scattering simulations are presented. The overset-grid FVTD algorithm is demonstrated to produce results of comparable accuracy to single-grid simulations while simultaneously shortening the grid-generation process and increasing the flexibility and utility of the FVTD technique. 
Furthermore, the new domain-decomposition approaches developed for overset grids are shown to be capable of producing partitions that are better load balanced and

  20. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological researches, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic process, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  1. A comprehensive computational study on pathogenic missense mutations spanning the RING2 and REP domains of Parkin protein.

    Science.gov (United States)

    Biswas, Ria; Bagchi, Angshuman

    2017-04-30

    Various mutations in the PARK2 gene, which encodes the protein parkin, are significantly associated with the onset of autosomal recessive juvenile Parkinsonism (ARJP) in neuronal cells. Parkin is a multi-domain protein; the N-terminal part contains the Ubl domain and the C-terminal part consists of four zinc-coordinating domains, viz., RING0, RING1, in-between-ring (IBR) and RING2. Disease mutations are spread over all the domains of Parkin, although mutations in some regions may affect the functionality of Parkin more adversely. The mutations in the RING2 domain are seen to abolish the neuroprotective E3 ligase activity of Parkin. In this work, we carried out detailed in silico analysis to study the extent of pathogenicity of mutations spanning the Parkin RING2 domain and the adjoining REP region with SIFT, Mutation Assessor, PolyPhen2, SNPs&GO, GV/GD and I-Mutant. To study the structural and functional implications of these mutations on the RING2-REP domain of Parkin, we studied the solvent accessibility (SASA/RSA), hydrophobicity, intra-molecular hydrogen bonding profile and domain architecture with various computational tools. Finally, we analysed the interaction energy profiles of the mutants and compared them to the wild-type protein using Discovery Studio 2.5. Comparing the various analyses, it can be safely concluded that, except for the P437L and A379V mutations, all other mutations are potentially deleterious, affecting various structural aspects of the RING2 domain architecture. This study is based purely on a computational approach, which has the potential to identify disease mutations, and the information could further be used in the treatment and prognosis of disease. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. The Importance of Computer Science for Public Health Training: An Opportunity and Call to Action.

    Science.gov (United States)

    Kunkle, Sarah; Christie, Gillian; Yach, Derek; El-Sayed, Abdulrahman M

    2016-01-01

    A century ago, the Welch-Rose Report established a public health education system in the United States. Since then, the system has evolved to address emerging health needs and integrate new technologies. Today, personalized health technologies generate large amounts of data. Emerging computer science techniques, such as machine learning, present an opportunity to extract insights from these data that could help identify high-risk individuals and tailor health interventions and recommendations. As these technologies play a larger role in health promotion, collaboration between the public health and technology communities will become the norm. Offering public health trainees coursework in computer science alongside traditional public health disciplines will facilitate this evolution, improving public health's capacity to harness these technologies to improve population health.

  3. An Image-Domain Contrast Material Extraction Method for Dual-Energy Computed Tomography.

    Science.gov (United States)

    Lambert, Jack W; Sun, Yuxin; Gould, Robert G; Ohliger, Michael A; Li, Zhixi; Yeh, Benjamin M

    2017-04-01

    Conventional material decomposition techniques for dual-energy computed tomography (CT) assume mass or volume conservation, where the CT number of each voxel is fully assigned to predefined materials. We present an image-domain contrast material extraction process (CMEP) method that preferentially extracts contrast-producing materials while leaving the remaining image intact. Image-processing freeware (Fiji) is used to perform consecutive arithmetic operations on a dual-energy ratio map to generate masks, which are then applied to the original images to generate material-specific images. First, a low-energy image is divided by a high-energy image to generate a ratio map. The ratio map is then split into material-specific masks: ratio intervals known to correspond to particular materials (e.g., iodine, calcium) are assigned a multiplier of 1, whereas ratio values between these intervals are assigned linear gradients from 0 to 1. The masks are then multiplied by an original CT image to produce material-specific images. The method was tested quantitatively at dual-source CT and rapid kVp-switching CT (RSCT) with phantoms using pure and mixed formulations of tungsten, calcium, and iodine. Errors were evaluated by comparing the known material concentrations with those derived from the CMEP material-specific images. Further qualitative evaluation was performed in vivo at RSCT with a rabbit model using CMEP parameters identical to those of the phantom. Orally administered tungsten, vascularly administered iodine, and skeletal calcium were used as the 3 contrast materials. All 5 material combinations (tungsten, iodine, and calcium, and mixtures of tungsten-calcium and iodine-calcium) showed distinct dual-energy ratios, largely independent of material concentration at both dual-source CT and RSCT. The CMEP was successful in both phantoms and in vivo. For pure contrast materials in the phantom, the maximum error between the known and CMEP-derived material concentrations was 0.9 mg
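The mask arithmetic described in the abstract (ratio map, piecewise-linear interval masks, multiplication back into the original image) can be sketched in a few lines. This is an illustrative reconstruction only: the interval bounds, ramp width, and toy pixel values below are hypothetical, not the calibrated ratios from the study.

```python
import numpy as np

def material_mask(ratio_map, lo, hi, ramp=0.1):
    """Piecewise-linear mask: 1 inside [lo, hi], linear ramp to 0 over width `ramp`."""
    r = np.asarray(ratio_map, dtype=float)
    rising = np.clip((r - (lo - ramp)) / ramp, 0.0, 1.0)    # ramps 0 -> 1 approaching lo
    falling = np.clip(((hi + ramp) - r) / ramp, 0.0, 1.0)   # ramps 1 -> 0 leaving hi
    return rising * falling

def extract_material(low_kv_img, high_kv_img, lo, hi, ramp=0.1):
    """CMEP-style extraction: ratio map -> material mask -> material-specific image."""
    ratio = low_kv_img / high_kv_img                 # dual-energy ratio map
    return material_mask(ratio, lo, hi, ramp) * low_kv_img

# Toy 2x2 "images": pixel (0,0) has an iodine-like ratio (~2.0), pixel (0,1) a
# calcium-like ratio (~1.5); the bottom row is soft-tissue-like (ratio ~1.0).
low = np.array([[200.0, 150.0], [100.0, 100.0]])
high = np.array([[100.0, 100.0], [100.0, 100.0]])
iodine_img = extract_material(low, high, lo=1.9, hi=2.1)
```

Pixels whose ratio falls inside the hypothetical iodine interval keep their full CT value; pixels outside it (calcium- and tissue-like ratios here) are suppressed to zero.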

  4. Time-Domain Techniques for Computation and Reconstruction of One-Dimensional Profiles

    Directory of Open Access Journals (Sweden)

    M. Rahman

    2005-01-01

    Full Text Available This paper presents a time-domain technique to compute the electromagnetic fields and to reconstruct the permittivity profile within a one-dimensional medium of finite length. The medium is characterized by permittivity and conductivity profiles that vary only with depth, so the scattering problem is one-dimensional. The modeling tool is divided into two schemes, named the forward solver and the inverse solver. The task of the forward solver is to compute the internal fields of the specimen, which is performed by a Green's function approach. When a known electromagnetic wave is incident normally on the medium, the resulting electromagnetic field within the medium can be calculated by constructing a Green's operator. This operator maps the incident field on either side of the medium to the field at an arbitrary observation point; it is a matrix of integral operators with kernels satisfying known partial differential equations. The reflection and transmission behavior of the medium is also determined from the boundary values of the Green's operator. The inverse solver is responsible for solving an inverse scattering problem by reconstructing the permittivity profile of the medium. Though several algorithms could be used to solve this problem, the invariant embedding method, also known as the layer-stripping method, has been implemented here because it requires only a finite time trace of reflection data. Here only one round trip of reflection data is used, where one round trip is defined as the time required by the pulse to propagate through the medium and back again. The inversion process begins by retrieving the reflection kernel from the reflected-wave data using a deconvolution technique. The rest of the task can easily be performed by applying a numerical approach to determine the different profile parameters. Both solvers have been found to have the

  5. A Comparative Assessment of Computer Literacy of Private and Public Secondary School Students in Lagos State, Nigeria

    Science.gov (United States)

    Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun

    2013-01-01

    The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…

  6. Configuring Embeddable Adaptive Computing Systems for Multiple Application Domains with Minimal Size, Weight, and Power

    Science.gov (United States)

    2003-09-01

    Related publications cited in the report: ... Systems and Applications (EHPC '98), in Lecture Notes in Computer Science 1388: Parallel and Distributed Processing, edited by Jose Rolim, sponsor: IEEE Computer Society; ... Applications (EHPC 2000), in Lecture Notes in Computer Science, IPDPS 2000 Workshops, sponsor: IEEE Computer Society, Cancun, Mexico, May 2000, pp. 776-

  7. Domain of the Gods: Do traditional beliefs hinder public acceptance of the human role in climate change?

    Science.gov (United States)

    Donner, S.

    2008-12-01

    Public acceptance of new scientific discoveries like natural selection, plate tectonics, or the human role in climate change naturally lags behind the pace of the discoveries. In the case of climate change, unease with or outright rejection of the scientific evidence for the role of human activity has been a hindrance to mitigation and adaptation efforts. This skepticism is normally attributed to everything from the quality of science education, to disinformation campaigns by representatives of the coal and gas industry, to individual resistance to behavioral change, to the nature of the modern information culture. Though often inspired by politics, economics, and the particular dynamics of climate change, this skepticism may actually be rooted in ancient beliefs that the climate is beyond the influence of humans. In this presentation, I will outline how the notion that humans control or influence the weather runs contrary to thousands of years of belief in a separation between the earth (the domain of man) and the sky (the domain of the gods). Evidence from religious history, from traditional villages in the Pacific (Fiji and Kiribati), and from public discourse in North America all indicates that the millennia-old belief in an earth-sky separation hinders people's acceptance that human activity is affecting the climate. The human role in climate change therefore represents a substantial paradigm shift, similar to the role of natural selection in human evolution. These deep roots of climate change skepticism must be factored into public climate change education efforts.

  8. The Maritime Public Domain - concept and implementation in diferent national legal systems.

    Directory of Open Access Journals (Sweden)

    Marco Gameiro Antunes

    2014-05-01

    The paper will also describe how the ownership (public vs. private) of coastal and estuarine margins is seen in some legal systems, considering the contribution of the MPD to the protection of estuarine waters and to biodiversity resources.

  9. 32 CFR 705.35 - Armed Forces participation in events in the public domain.

    Science.gov (United States)

    2010-07-01

    ... Department of Defense to ensure compliance with public law, to assure equitable distribution of resources to... primary attraction. (4) Armed Forces participation is authorized in a fund-raising event only when the...

  10. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code's portability across architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVIDIA GPUs, and the Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
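The forward problem in such codes is time-domain finite-difference wave propagation. A minimal 1D acoustic sketch conveys the stepping scheme; this is not SeisCL's viscoelastic kernel, and the grid, velocity, and source-wavelet parameters below are hypothetical illustrations only.

```python
import math

def propagate_1d(nx=200, nt=400, dx=5.0, dt=0.001, c=2000.0, src_ix=100):
    """Second-order finite-difference time stepping of the 1D acoustic wave equation
    with fixed (Dirichlet) boundaries. CFL number c*dt/dx = 0.4 here, safely inside
    the stability limit of 1 for this scheme."""
    u_prev = [0.0] * nx
    u_curr = [0.0] * nx
    r2 = (c * dt / dx) ** 2
    for it in range(nt):
        u_next = [0.0] * nx
        for ix in range(1, nx - 1):
            u_next[ix] = (2.0 * u_curr[ix] - u_prev[ix]
                          + r2 * (u_curr[ix + 1] - 2.0 * u_curr[ix] + u_curr[ix - 1]))
        # Gaussian source wavelet injected at src_ix (hypothetical parameters)
        t = it * dt
        u_next[src_ix] += math.exp(-((t - 0.05) / 0.01) ** 2)
        u_prev, u_curr = u_curr, u_next
    return u_curr

wavefield = propagate_1d()
```

With the CFL condition satisfied the wavefield stays bounded; the adjoint computation in FWI runs essentially the same kernel backwards in time with the data residuals as sources, which is why GPU-friendly stencil code pays off twice per gradient evaluation.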

  11. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components, and the services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  12. A computer scientist’s evaluation of publically available hardware Trojan benchmarks

    OpenAIRE

    Slayback, Scott M.

    2015-01-01

    Approved for public release; distribution is unlimited Dr. Hassan Salmani and Dr. Mohammed Tehranipoor have developed a collection of publically available hardware Trojans, meant to be used as common benchmarks for the analysis of detection and mitigation techniques. In this thesis, we evaluate a selection of these Trojans from the perspective of a computer scientist with limited electrical engineering background. Note that this thesis is also intended to serve as a supplement to the exist...

  13. Production of radioactive phantoms using a standard inkjet printer and the public domain multi-printing code GENIA.

    Science.gov (United States)

    Scafè, R; Auer, P; Bennati, P; La Porta, L; Pisacane, F; Cinti, M N; Pellegrini, R; De Vincentis, G; Conte, G; Pani, R

    2011-10-01

    The public domain code GENIA, based on a multi-printing method for producing surface sources with appropriate radioactivity, is described. The conventional technique, which runs on a standard inkjet printer whose cartridge is filled with radio-marked ink, is improved by repeating elementary printing commands within the same band. Well-outlined sources with adjustable radioactivity can be obtained without refilling, overcoming the intrinsic limitation on printable radioactivity set by the amount of activity available at the nozzles at printing time. In addition, the method permits accurate calibration of the amount of activity released onto the paper.

  14. Democratizing Computer Science Knowledge: Transforming the Face of Computer Science through Public High School Education

    Science.gov (United States)

    Ryoo, Jean J.; Margolis, Jane; Lee, Clifford H.; Sandoval, Cueponcaxochitl D. M.; Goode, Joanna

    2013-01-01

    Despite the fact that computer science (CS) is the driver of technological innovations across all disciplines and aspects of our lives, including participatory media, high school CS too commonly fails to incorporate the perspectives and concerns of low-income students of color. This article describes a partnership program -- Exploring Computer…

  15. Efficient computation of turbulent flow in ribbed passages using a non-overlapping near-wall domain decomposition method

    Science.gov (United States)

    Jones, Adam; Utyuzhnikov, Sergey

    2017-08-01

    Turbulent flow in a ribbed channel is studied using an efficient near-wall domain decomposition (NDD) method. The NDD approach is formulated by splitting the computational domain into an inner and outer region, with an interface boundary between the two. The computational mesh covers the outer region, and the flow in this region is solved using the open-source CFD code Code_Saturne with special boundary conditions on the interface boundary, called interface boundary conditions (IBCs). The IBCs are of Robin type and incorporate the effect of the inner region on the flow in the outer region. IBCs are formulated in terms of the distance from the interface boundary to the wall in the inner region. It is demonstrated that up to 90% of the region between the ribs in the ribbed passage can be removed from the computational mesh with an error on the friction factor within 2.5%. In addition, computations with NDD are faster than computations based on low Reynolds number (LRN) models by a factor of five. Different rib heights can be studied with the same mesh in the outer region without affecting the accuracy of the friction factor. This is tested with six different rib heights in an example of a design optimisation study. It is found that the friction factors computed with NDD are almost identical to the fully-resolved results. When used for inverse problems, NDD is considerably more efficient than LRN computations because only one computation needs to be performed and only one mesh needs to be generated.
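The Robin-type interface boundary condition can be motivated in a strongly simplified setting. The following is an illustrative sketch only, not the actual IBC transfer of the NDD method: assume diffusion dominates in the inner region $0 \le y \le y^*$ and the effective viscosity $\mu$ is constant, so the momentum flux $\mu\,\partial u/\partial y$ is constant there; integrating from the no-slip wall to the interface then yields a condition involving only the interface distance $y^*$:

```latex
% Illustrative only: constant-\mu, diffusion-dominated inner region
\mu \frac{\partial u}{\partial y} = \tau_w \quad \text{for } 0 \le y \le y^*,
\qquad u(0) = 0
\;\;\Longrightarrow\;\;
u(y^*) = \frac{\tau_w}{\mu}\, y^*
\;\;\Longrightarrow\;\;
u - y^* \frac{\partial u}{\partial y} = 0 \quad \text{at } y = y^*.
```

The actual IBCs incorporate near-wall turbulence modeling in the inner region, but retain this Robin form, which is consistent with the abstract's statement that they are formulated in terms of the distance from the interface boundary to the wall.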

  16. Citing National Publications as a Metric for Localization of Science: A Study on Scholarly Journals of the Social Science Domain in Iran from 2002 to 2010

    Directory of Open Access Journals (Sweden)

    Mohammad Tavakolizadeh-Ravari

    2015-02-01

    Conclusion: Based on this metric, the tendency toward the use of national publications in the social science domain is constant even though the number of scientific publications in Iran is growing. This means that the Iranian social science domain not only shows no tendency toward science localization, but its use of foreign science is also growing over time.

  17. Deciphering protein-protein interactions. Part II. Computational methods to predict protein and domain interaction partners

    National Research Council Canada - National Science Library

    Shoemaker, Benjamin A; Panchenko, Anna R

    2007-01-01

    .... In this review we describe different approaches to predict protein interaction partners as well as highlight recent achievements in the prediction of specific domains mediating protein-protein interactions...

  18. Sequence Tolerance of a Highly Stable Single Domain Antibody: Comparison of Computational and Experimental Profiles

    Science.gov (United States)

    2016-09-09

    popular construct of the heavy chain is the single-domain antibody (sdAb) derived from camelids. The interest in sdAbs lies in their biotechnological ... thermally stable llama single-domain antibody specific for Staphylococcus aureus enterotoxin B. BMC Biotechnology 11. 3. Zabetakis D, Anderson GP, Bayya N ... neighbors from a Dali search of the single-chain conformers of the assembly. (a) Conformers of A3, where green denotes the A chain and blue

  19. Changing Perceptions of Homesteading as a Policy of Public Domain Disposal

    Science.gov (United States)

    Edwards, Richard

    2009-01-01

    The inspiring story of homesteaders claiming free land and realizing their dreams became one of the enduring narratives of American history. But scholars who have studied homesteading have often been much more ambivalent, even harshly negative, about how successful it was in practice. While the public often views our history differently from…

  20. Designing personal attentive user interfaces in the mobile public safety domain

    NARCIS (Netherlands)

    Streefkerk, J.W.; Esch van-Bussemakers, M.P.; Neerincx, M.A.

    2006-01-01

    In the mobile computing environment, there is a need to adapt the information and service provision to the momentary attentive state of the user, operational requirements and usage context. This paper proposes to design personal attentive user interfaces (PAUI) for which the content and style of inf

  1. An Exploratory Study of Malaysian Publication Productivity in Computer Science and Information Technology.

    Science.gov (United States)

    Gu, Yinian

    2002-01-01

    Explores the Malaysian computer science and information technology publication productivity as indicated by data collected from three Web-based databases. Relates possible reasons for the amount and pattern of contributions to the size of researcher population, the availability of refereed scholarly journals, and the total expenditure allocated to…

  2. WAVE MAKING COMPUTATION IN TIME DOMAIN FOR MULTI-HULL SHIPS

    Institute of Scientific and Technical Information of China (English)

    LI Guo-an; YE Heng-kui

    2006-01-01

    A three-dimensional time-domain Green function method satisfying linear free-surface and body-surface boundary conditions was employed to analyze the wave resistance and wave profile of a displacement multi-hull ship. The wave profile induced by a moving time-domain point source was compared with that of a Havelock source, and satisfactory results were obtained. A panel method based on the time-domain source distribution on the ship's mean wetted hull surface was used to perform the wave-making computations for mono-hull ships, a catamaran, and a trimaran, and reasonable results were also obtained. Using this numerical method, wave profile simulations of multi-hull ships for a given Froude number were conducted.

  3. Learning From Engineering and Computer Science About Communicating The Field To The Public

    Science.gov (United States)

    Moore, S. L.; Tucek, K.

    2014-12-01

    The engineering and computer science community has taken the lead in actively informing the public about its discipline, including its societal contributions and career opportunities. These efforts have been intensified with regard to informing underrepresented populations in STEM about engineering and computer science. Are there lessons to be learned by the geoscience community in communicating the societal impacts of and career opportunities in the geosciences, especially with regard to broadening participation and meeting the Next Generation Science Standards? An estimated 35 percent increase in the number of geoscientist jobs in the United States forecast for the period between 2008 and 2018, combined with majority populations becoming minority populations, makes it imperative that we improve how we increase the public's understanding of the geosciences and how we present our message to targeted populations. This talk will look at recommendations from the National Academy of Engineering's Changing the Conversation: Messages for Improving the Public Understanding of Engineering, and at communication strategies by organizations such as Code.org, to highlight practices that the geoscience community can adopt to increase public awareness of the societal contributions of the geosciences, the career opportunities in the geosciences, and the importance of the geosciences in the Next Generation Science Standards. Earth is Calling, an effort to communicate geoscience to the public, will be compared and contrasted with these efforts and used as an example of how geological societies and other organizations can engage the general public and targeted groups about the geosciences.

  4. Design and development of semantic web-based system for computer science domain-specific information retrieval

    Directory of Open Access Journals (Sweden)

    Ritika Bansal

    2016-09-01

    Full Text Available In a semantic web-based system, the concept of an ontology is used to retrieve results by the contextual meaning of the input query instead of by keyword matching. The research literature suggests a need for a tool that provides an easy interface for complex natural-language queries and retrieves domain-specific information from an ontology. This paper proposes the IRSCSD system (Information Retrieval System for the Computer Science Domain) as a solution. The system offers advanced querying and browsing of structured data, with search results automatically aggregated and rendered directly in a consistent user interface, thus reducing manual effort for users. The main objective of this research is therefore the design and development of a semantic web-based system integrating an ontology for domain-specific retrieval support. The methodology followed is piecemeal research involving the following stages. The first stage involves designing the framework for the semantic web-based system. The second stage builds a prototype of the framework using the Protégé tool. The third stage deals with converting natural-language queries into the SPARQL query language using the Python-based Quepy framework. The fourth stage involves firing the converted SPARQL queries at the ontology through Apache's Jena API to fetch the results. Lastly, the prototype was evaluated to ensure its efficiency and usability. Thus, this paper describes framework development for a semantic web-based system that assists in efficient retrieval of domain-specific information, interpretation of natural-language queries into a semantic web language, creation of a domain-specific ontology, and its mapping with related ontologies. The paper also provides approaches and metrics for ontology evaluation, applied to the prototype ontology to study performance based on accessibility of the required domain-related information.
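The pipeline the abstract describes (natural-language question, SPARQL translation, query against the ontology) can be illustrated with a toy query. The `cs:` vocabulary, class, and property names below are hypothetical stand-ins, not the actual IRSCSD ontology terms:

```sparql
# Natural-language input:  "Which papers about ontologies were published after 2010?"
# A Quepy-style translation into SPARQL (hypothetical cs: vocabulary):
PREFIX cs: <http://example.org/cs-ontology#>
SELECT ?paper ?year
WHERE {
  ?paper a cs:Paper ;
         cs:hasTopic cs:Ontology ;
         cs:publicationYear ?year .
  FILTER (?year > 2010)
}
ORDER BY DESC(?year)
```

A query of this shape is what the third stage would emit and the fourth stage would execute against the ontology through a SPARQL engine such as Jena's.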

  5. 41 CFR 102-75.100 - When an agency holds land withdrawn or reserved from the public domain and determines that it no...

    Science.gov (United States)

    2010-07-01

    ... land withdrawn or reserved from the public domain and determines that it no longer needs this land, what must it do? 102-75.100 Section 102-75.100 Public Contracts and Property Management Federal... it no longer needs this land, what must it do? An agency holding unneeded land withdrawn or...

  6. Fraction-free algorithm for the computation of diagonal forms of matrices over Ore domains using Gröbner bases

    CERN Document Server

    Levandovskyy, Viktor

    2011-01-01

    This paper is a sequel to "Computing diagonal form and Jacobson normal form of a matrix using Groebner bases", J. of Symb. Computation, 46 (5), 2011. We present a new fraction-free algorithm for the computation of a diagonal form of a matrix over a certain non-commutative Euclidean domain over a computable field with the help of Gröbner bases. This algorithm is formulated in a general constructive framework of non-commutative Ore localizations of $G$-algebras (OLGAs). We split the computation of a normal form of a matrix into the diagonalization and the normalization processes. Both of them can be made fraction-free. For a matrix $M$ over an OLGA we provide a diagonalization algorithm to compute $U,V$ and $D$ with fraction-free entries such that $UMV=D$ holds and $D$ is diagonal. The fraction-free approach gives us more information on the system of linear functional equations and its solutions than the classical setup of an operator algebra with rational function coefficients. In particular, one can handl...
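In the commutative special case of integer matrices, the diagonal form $UMV=D$ corresponds to the classical Smith normal form, whose 2×2 invariant factors follow directly from gcds and the determinant. The following toy computation is that commutative analogue, not the paper's Ore-domain algorithm; it is nevertheless fraction-free in the same spirit (only integer arithmetic is used):

```python
from math import gcd

def diagonal_form_2x2(a, b, c, d):
    """Invariant factors d1 | d2 of an integer 2x2 matrix [[a, b], [c, d]]:
    d1 = gcd of all entries, d1 * d2 = |det|  (classical Smith normal form)."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular case not handled in this sketch")
    d1 = gcd(gcd(a, b), gcd(c, d))
    d2 = abs(det) // d1
    return d1, d2  # D = diag(d1, d2), with d1 dividing d2

# There exist unimodular U, V with U M V = diag(d1, d2); here we only report D:
print(diagonal_form_2x2(2, 4, 6, 8))   # → (2, 4)
```

The non-commutative setting of the paper replaces these gcd computations with Gröbner-basis reductions, but the shape of the result (a diagonal $D$ with fraction-free transformation matrices) is the same.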

  7. Configuration of the catalytic GIY-YIG domain of intron endonuclease I-TevI: coincidence of computational and molecular findings.

    OpenAIRE

    Kowalski, J C; Belfort, M; Stapleton, M A; Holpert, M; Dansereau, J T; Pietrokovski, S; Baxter, S M; Derbyshire, V

    1999-01-01

    I-TevI is a member of the GIY-YIG family of homing endonucleases. It is folded into two structural and functional domains, an N-terminal catalytic domain and a C-terminal DNA-binding domain, separated by a flexible linker. In this study we have used genetic analyses, computational sequence analysis and NMR spectroscopy to define the configuration of the N-terminal domain and its relationship to the flexible linker. The catalytic domain is an alpha/beta structure contained within the first 92 am...

  8. Computational and experimental investigations of magnetic domain structures in patterned magnetic thin films

    Science.gov (United States)

    Li, Yulan; Xu, Ke; Hu, Shenyang; Suter, Jon; Schreiber, Daniel K.; Ramuhalli, Pradeep; Johnson, Bradley R.; McCloy, John

    2015-08-01

    The use of nondestructive magnetic signatures for continuous monitoring of the degradation of structural materials in nuclear reactors is a promising yet challenging application for advanced functional materials behavior modeling and measurement. In this work, a numerical model, which is based on the Landau-Lifshitz-Gilbert equation of magnetization dynamics and the phase field approach, was developed to study the impact of defects such as nonmagnetic precipitates and/or voids, free surfaces and crystal orientation on magnetic domain structures and magnetic responses in magnetic materials, with the goal of exploring the correlation between microstructures and magnetic signatures. To validate the model, single crystal iron thin films (~240 nm thickness) were grown on MgO substrates and a focused ion beam was used to pattern micrometer-scale specimens with different geometries. Magnetic force microscopy (MFM) was used to measure magnetic domain structure and its field-dependence. Numerical simulations were constructed with the same geometry as the patterned specimens and under similar applied magnetic field conditions as tested by MFM. The results from simulations and experiments show that 1) magnetic domain structures strongly depend on the film geometry and the external applied field and 2) the predicted magnetic domain structures from the simulations agree quantitatively with those measured by MFM. The results demonstrate the capability of the developed model, used together with key experiments, for improving the understanding of the signal physics in magnetic sensing, thereby providing guidance to the development of advanced nondestructive magnetic techniques.
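The core of such models is the Landau-Lifshitz-Gilbert equation of magnetization dynamics. For a single macrospin it can be integrated in a few lines to show the damped precession toward an applied field; the dimensionless parameters below are illustrative, not those of the iron-film simulations:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def llg_step(m, H, gamma=1.0, alpha=0.5, dt=0.01):
    """One explicit Euler step of the Landau-Lifshitz form of the LLG equation:
    dm/dt = -gamma/(1+alpha^2) * (m x H + alpha * m x (m x H)).
    The magnetization is renormalized to |m| = 1 after each step."""
    mxH = cross(m, H)
    mxmxH = cross(m, mxH)
    pre = -gamma / (1.0 + alpha ** 2)
    m = tuple(mi + dt * pre * (p + alpha * q) for mi, p, q in zip(m, mxH, mxmxH))
    norm = sum(mi * mi for mi in m) ** 0.5
    return tuple(mi / norm for mi in m)

m = (1.0, 0.0, 0.0)        # start perpendicular to the field
H = (0.0, 0.0, 1.0)        # applied field along +z
for _ in range(5000):
    m = llg_step(m, H)
# after many damped precession cycles, m has relaxed toward the field direction
```

A full micromagnetic or phase-field model couples many such spins through exchange, anisotropy, and demagnetizing fields, which is what produces the domain structures imaged by MFM; the single-spin damping toward the field is the elementary ingredient.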

  9. A filtered convolution method for the computation of acoustic wave fields in very large spatiotemporal domains

    NARCIS (Netherlands)

    Verweij, M.D.; Huijssen, J.

    2009-01-01

    The full-wave computation of transient acoustic fields with sizes on the order of 100x100x100 wavelengths by 100 periods requires a numerical method that is extremely efficient in terms of storage and computation. Iterative integral equation methods offer good performance on these points, provided

  10. Using Application-Domain Knowledge in the Runtime Support of Multi-Experiment Computational Studies

    Science.gov (United States)

    2009-01-01

    Ewa Deelman, Yolanda Gil, Carl Kesselman, Amit Agarwal, Gaurang Mehta, and Karan Vahi. The role of planning in grid computing. In Proceedings of the... Congress on Computational Intelligence, pages 82-87, 1994. [48] Ken Brodlie, Jason Wood, and Jeremy Walton. GViz - visualization and steering for the grid.

  11. Asthma in Urban Children: Epidemiology, Environmental Risk Factors, and the Public Health Domain.

    Science.gov (United States)

    Milligan, Ki Lee; Matsui, Elizabeth; Sharma, Hemant

    2016-04-01

    Asthma is the most commonly reported chronic condition of childhood in developed countries, with 6.5 million children affected in the USA. A disparate burden of childhood asthma is seen among socioeconomically disadvantaged youth, often concentrated in urban areas with high poverty rates. Host factors that predispose a child to asthma include atopy, male gender, parental history of asthma, and also race, ethnicity, and genetic and epigenetic susceptibilities. Environmental factors, such as improved hygiene, ambient air pollution, and early life exposures to microbes and aeroallergens, also influence the development of asthma. With greater than 90% of time spent indoors, home exposures (such as cockroach, rodent, and indoor air pollution) are highly relevant for urban asthma. Morbidity reduction may require focused public health initiatives for environmental intervention in high priority risk groups and the addition of immune modulatory agents in children with poorly controlled disease.

  12. A high-order public domain code for direct numerical simulations of turbulent combustion

    CERN Document Server

    Babkovskaia, N; Brandenburg, A

    2010-01-01

    A high-order scheme for direct numerical simulations of turbulent combustion is discussed. Its implementation in the massively parallel and publicly available Pencil Code is validated with the focus on hydrogen combustion. Ignition delay times (0D) and laminar flame velocities (1D) are calculated and compared with results from the commercially available Chemkin code. The scheme is verified to be fifth order in space. Upon doubling the resolution, a 32-fold increase in the accuracy of the flame front is demonstrated. Finally, also turbulent and spherical flame front velocities are calculated and the implementation of the non-reflecting so-called Navier-Stokes Characteristic Boundary Condition is validated in all three directions.

  13. Leveraging Cloud Computing to Address Public Health Disparities: An Analysis of the SPHPS.

    Science.gov (United States)

    Jalali, Arash; Olabode, Olusegun A; Bell, Christopher M

    2012-01-01

    As the use of certified electronic health record technology (CEHRT) has continued to gain prominence in hospitals and physician practices, public health agencies and health professionals can access health data through health information exchanges (HIE). With such knowledge, health providers are well positioned to positively affect population health and enhance health status or quality-of-life outcomes in at-risk populations. Through big data analytics, predictive analytics, and cloud computing, public health agencies have the opportunity to observe emerging public health threats in real time and provide more effective interventions addressing health disparities in our communities. The Smarter Public Health Prevention System (SPHPS) provides real-time reporting of potential public health threats to public health leaders through a simple and efficient dashboard, and links people with needed personal health services through mobile platforms for smartphones and tablets to promote and encourage healthy behaviors in our communities. The purpose of this working paper is to evaluate how a secure virtual private cloud (VPC) solution could facilitate the implementation of the SPHPS in order to address public health disparities.

  14. Development of mooring-anchor program in public domain for coupling with floater program for FOWTs (Floating Offshore Wind Turbines)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, MooHyun [American Bureau of Shipping (ABS), Houston, TX (United States)

    2014-08-01

    This report presents the development of offshore anchor data sets which are intended to be used to develop a database that allows preliminary selection and sizing of anchors for the conceptual design of floating offshore wind turbines (FOWTs). The study is part of a project entitled “Development of Mooring-Anchor Program in Public Domain for Coupling with Floater Program for FOWTs (Floating Offshore Wind Turbines)”, under the direction of Dr. Moo-Hyun Kim at the Texas A&M University and with the sponsorship from the US Department of Energy (Contract No. DE-EE0005479, CFDA # 81.087 for DE-FOA-0000415, Topic Area 1.3: Subsurface Mooring and Anchoring Dynamics Models).

  15. Computational Study of Correlated Domain Motions in the AcrB Efflux Transporter

    Directory of Open Access Journals (Sweden)

    Robert Schulz

    2015-01-01

    Full Text Available As active part of the major efflux system in E. coli bacteria, AcrB is responsible for the uptake and pumping of toxic substrates from the periplasm toward the extracellular space. In combination with the channel protein TolC and membrane fusion protein AcrA, this efflux pump is able to help the bacterium to survive different kinds of noxious compounds. With the present study we intend to enhance the understanding of the interactions between the domains and monomers, for example, the transduction of mechanical energy from the transmembrane domain into the porter domain, correlated motions of different subdomains within monomers, and cooperative effects between monomers. To this end, targeted molecular dynamics simulations have been employed either steering the whole protein complex or specific parts thereof. By forcing only parts of the complex towards specific conformational states, the risk for transient artificial conformations during the simulations is reduced. Distinct cooperative effects between the monomers in AcrB have been observed. Possible allosteric couplings have been identified providing microscopic insights that might be exploited to design more efficient inhibitors of efflux systems.

  16. Object-Oriented Implementation of the Finite-Difference Time-Domain Method in Parallel Computing Environment

    Science.gov (United States)

    Chun, Kyungwon; Kim, Huioon; Hong, Hyunpyo; Chung, Youngjoo

    GMES, which stands for GIST Maxwell's Equations Solver, is a Python package for Finite-Difference Time-Domain (FDTD) simulation. The FDTD method, widely used for electromagnetic simulations, is an algorithm for solving Maxwell's equations. GMES follows the Object-Oriented Programming (OOP) paradigm for good maintainability and usability. With several optimization techniques and a parallel computing environment, we achieved a fast and interactive implementation. Execution speed has been tested on a single host and on a Beowulf-class cluster. GMES is open source and available on the web (http://www.sf.net/projects/gmes).
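    The leapfrog update at the heart of any FDTD solver such as GMES can be sketched in a few lines. This is a generic, illustrative 1D example in normalized units (grid size, step count, and source position are arbitrary assumptions), not code from the GMES package itself:

```python
import numpy as np

def fdtd_1d(steps=200, nx=200, source_pos=100):
    """Minimal 1D FDTD (Yee) leapfrog loop with a soft Gaussian source.

    Normalized units with Courant number 1; boundaries act as perfect
    electric conductors because the edge E-field nodes are never updated.
    """
    ez = np.zeros(nx)       # electric field on integer grid points
    hy = np.zeros(nx - 1)   # magnetic field on the staggered half-grid
    for t in range(steps):
        hy += np.diff(ez)                                   # H update from curl E
        ez[1:-1] += np.diff(hy)                             # E update from curl H
        ez[source_pos] += np.exp(-((t - 30) / 10.0) ** 2)   # Gaussian pulse source
    return ez

field = fdtd_1d()
```

    A real solver adds material coefficients, absorbing boundaries, and (as in GMES) parallel domain decomposition, but the staggered-grid update above is the core of the method.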

  17. A model of Cross Language Retrieval for IT domain papers through a map of ACM Computing Classification System

    CERN Document Server

    Kembellec, Gérald; Sauvaget, Catherine

    2011-01-01

    This article presents a concept model and an associated tool that helps advanced learners find a suitable bibliography. The goal is to use an IT-domain representation as educational research software for newcomers to research. We use an ontology based on the ACM's Computing Classification System to find scientific articles directly related to the new researcher's domain without any formal query. A French translation of the ontology is automatically proposed and can be enhanced, in Web 2.0 fashion, by a community of users. A visualization and navigation model is proposed to make it more accessible, and examples are given to show the interface of our tool, Ontology Navigator.

  18. Open window: when easily identifiable genomes and traits are in the public domain.

    Directory of Open Access Journals (Sweden)

    Misha Angrist

    Full Text Available "One can't be of an enquiring and experimental nature, and still be very sensible."--Charles Fort. As the costs of personal genetic testing "self-quantification" fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desires to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown.

  19. Evolution of Industry Knowledge in the Public Domain: Prior Art Searching for Software Patents

    Directory of Open Access Journals (Sweden)

    Jinseok Park

    2005-03-01

    Full Text Available Searching prior art is a key part of the patent application and examination processes. A comprehensive prior art search gives the inventor ideas as to how he can improve or circumvent existing technology by providing up to date knowledge on the state of the art. It also enables the patent applicant to minimise the likelihood of an objection from the patent office. This article explores the characteristics of prior art associated with software patents, dealing with difficulties in searching prior art due to the lack of resources, and considers public contribution to the formation of prior art databases. It addresses the evolution of electronic prior art in line with technological development, and discusses laws and practices in the EPO, USPTO, and the JPO in relation to the validity of prior art resources on the Internet. This article also investigates the main features of searching sources and tools in the three patent offices as well as non-patent literature databases. Based on the analysis of various searching databases, it provides some strategies of efficient prior art searching that should be considered for software-related inventions.

  20. Taking the High Ground: A Case for Department of Defense Application of Public Cloud Computing

    Science.gov (United States)

    2011-06-01

    IT cannot be sustained in a declining budget environment with users demanding better services. Wyld captures the essence of much of the problem for...the DoD laboratory data centers into model versions of public providers. An open source project, called Eucalyptus (http://www.eucalyptus.com), would...be an excellent starting point for such a project. Eucalyptus is a software plat- form for implementing private cloud computing solutions on top of

  1. Monitoring Urban Tree Cover Using Object-Based Image Analysis and Public Domain Remotely Sensed Data

    Directory of Open Access Journals (Sweden)

    Meghan Halabisky

    2011-10-01

    Full Text Available Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies, their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies; however, such techniques are not generally appropriate for assessing these highly variable landscapes. Landsat imagery has historically been used for per-pixel driven land use/land cover (LULC) classifications, but its spatial resolution limits our ability to map small urban features. In such cases, hyperspatial resolution imagery, such as aerial or satellite imagery with a resolution of 1 meter or below, is preferred. Object-based image analysis (OBIA) allows for the use of additional variables such as texture, shape, context, and other cognitive information provided by the image analyst to segment and classify image features, and thus improve classifications. As part of this research we created LULC classifications for a pilot study area in Seattle, WA, USA, using OBIA techniques and freely available public aerial photography. We analyzed the differences in accuracies which can be achieved with OBIA using multispectral and true-color imagery. We also compared our results to a satellite-based OBIA LULC and discussed the implications of per-pixel driven vs. OBIA-driven field sampling campaigns. We demonstrated that the OBIA approach can generate good and repeatable LULC classifications suitable for tree cover assessment in urban areas. Another important finding is that spectral content appeared to be more important than the spatial detail of hyperspatial data when it comes to an OBIA-driven LULC.

  2. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/.

  3. Serum Albumin Domain Structures in Human Blood Serum by Mass Spectrometry and Computational Biology.

    Science.gov (United States)

    Belsom, Adam; Schneider, Michael; Fischer, Lutz; Brock, Oliver; Rappsilber, Juri

    2016-03-01

    Chemical cross-linking combined with mass spectrometry has proven useful for studying protein-protein interactions and protein structure; however, the low density of cross-link data has so far precluded its use in determining structures de novo. Cross-linking density has typically been limited by the chemical selectivity of the standard cross-linking reagents that are commonly used for protein cross-linking. We have implemented the use of a heterobifunctional cross-linking reagent, sulfosuccinimidyl 4,4'-azipentanoate (sulfo-SDA), combining a traditional sulfo-N-hydroxysuccinimide (sulfo-NHS) ester and a UV-photoactivatable diazirine group. This diazirine yields a highly reactive and promiscuous carbene species, the net result being a greatly increased number of cross-links compared with homobifunctional, NHS-based cross-linkers. We present a novel methodology that combines the use of this high-density photo-cross-linking data with conformational space search to investigate the structure of human serum albumin domains, from purified samples, and in its native environment, human blood serum. Our approach is able to determine human serum albumin domain structures with good accuracy: root-mean-square deviations from the crystal structure are 2.8/5.6/2.9 Å (purified samples) and 4.5/5.9/4.8 Å (serum samples) for domains A/B/C for the first selected structure; 2.5/4.9/2.9 Å (purified samples) and 3.5/5.2/3.8 Å (serum samples) for the best of the top five selected structures. Our proof-of-concept study on human serum albumin demonstrates the initial potential of our approach for determining the structures of more proteins in the complex biological contexts in which they function and which they may require for correct folding. Data are available via ProteomeXchange with identifier PXD001692.

  4. Linear-scaling computation of excited states in time-domain

    Institute of Scientific and Technical Information of China (English)

    YAM ChiYung; CHEN GuanHua

    2014-01-01

    The applicability of quantum mechanical methods is severely limited by their poor scaling. To circumvent the problem, linear-scaling methods for quantum mechanical calculations have been developed. The physical basis of linear-scaling methods is locality in quantum mechanics, whereby the properties or observables of a system are weakly influenced by factors spatially far apart. Besides the substantial efforts spent on devising linear-scaling methods for the ground state, there is also growing interest in the development of linear-scaling methods for excited states. This review gives an overview of linear-scaling approaches for excited states solved in the real time domain.

  5. 6th International Workshop on Computer-Aided Scheduling of Public Transport

    CERN Document Server

    Branco, Isabel; Paixão, José

    1995-01-01

    This proceedings volume consists of papers presented at the Sixth International Workshop on Computer-Aided Scheduling of Public Transport, which was held at the Fundação Calouste Gulbenkian in Lisbon from July 6th to 9th, 1993. In the tradition of alternating Workshops between North America and Europe - Chicago (1975), Leeds (1980), Montreal (1983), Hamburg (1987) and again Montreal (1990) - the European city of Lisbon was selected as the venue for the Workshop in 1993. As in earlier Workshops, the central theme dealt with vehicle and duty scheduling problems and the employment of operations-research-based software systems for operational planning in public transport. However, as was initiated in Hamburg in 1987, the scope of this Workshop was broadened to include topics in related fields. This fundamental alteration was an inevitable consequence of the growing demand over the last decade for solutions to the complete planning process in public transport through integrated systems. Therefore, the program of thi...

  6. Seamless integration of cross-domain data centers in cloud computing

    Institute of Scientific and Technical Information of China (English)

    潘毅; 梁勇

    2014-01-01

    With the development of their business, telecom operators are gradually transitioning into cloud computing providers. To build a large-scale cloud computing platform, several data centers in different domains must be integrated seamlessly. Based on the successful dual-center integration of the Nanji and Zhongneng sites of the China Mobile Guangdong public cloud, this paper presents the implementation principles and details of the key technique, OTV (Overlay Transport Virtualization).

  7. Radiological Protection in Cone Beam Computed Tomography (CBCT). ICRP Publication 129.

    Science.gov (United States)

    Rehani, M M; Gupta, R; Bartling, S; Sharp, G C; Pauwels, R; Berris, T; Boone, J M

    2015-07-01

    The objective of this publication is to provide guidance on radiological protection in the new technology of cone beam computed tomography (CBCT). Publications 87 and 102 dealt with patient dose management in computed tomography (CT) and multi-detector CT. The new applications of CBCT and the associated radiological protection issues are substantially different from those of conventional CT. The perception that CBCT involves lower doses was only true in initial applications. CBCT is now used widely by specialists who have little or no training in radiological protection. This publication provides recommendations on radiation dose management directed at different stakeholders, and covers principles of radiological protection, training, and quality assurance aspects. Advice on appropriate use of CBCT needs to be made widely available. Advice on optimisation of protection when using CBCT equipment needs to be strengthened, particularly with respect to the use of newer features of the equipment. Manufacturers should standardise radiation dose displays on CBCT equipment to assist users in optimisation of protection and comparisons of performance. Additional challenges to radiological protection are introduced when CBCT-capable equipment is used for both fluoroscopy and tomography during the same procedure. Standardised methods need to be established for tracking and reporting of patient radiation doses from these procedures. The recommendations provided in this publication may evolve in the future as CBCT equipment and applications evolve. As with previous ICRP publications, the Commission hopes that imaging professionals, medical physicists, and manufacturers will use the guidelines and recommendations provided in this publication for implementation of the Commission's principle of optimisation of protection of patients and medical workers, with the objective of keeping exposures as low as reasonably achievable, taking into account economic and societal factors, and

  8. THE USE OF COMPUTER APPLICATIONS IN THE STUDY OF ROMANIA'S PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Popeanga Vasile

    2011-07-01

    Full Text Available Total public debt represents all monetary obligations of the state (government, public institutions, financial and administrative-territorial units) at a given time, resulting from internal and external loans (in lei and foreign currencies) contracted on short, medium, and long term, together with the state treasury's own obligations for the amounts advanced temporarily to cover the budget deficit. Loans may be contracted by the state through the Ministry of Finance, in its own name or guaranteed by it. Public debt is expressed in the local currency or in foreign currency, depending on where and under what conditions the loans are contracted. In order to evaluate Romania's public debt, obligations denominated in a currency other than the national currency are converted using the exchange rate of the National Bank of Romania. The total public debt of a country can be expressed in absolute values (to show the burden that the economy bears toward its creditors), in relative values as a percentage of GDP (to allow comparison over time and between countries), and as an average per capita (to allow comparisons and analysis in time and space). Total public debt is calculated, and its two forms are managed separately, namely domestic public debt and external public debt. The Ministry of Finance prepares and submits annually, to the Government for approval and to Parliament for information, a report on public debt, which contains information on the government debt portfolio, debt service, public indebtedness indicators, and information about the primary and secondary markets for state securities, as well as how the medium-term government debt management strategy was implemented in the previous year. In order to make quick and effective comparisons of public debt dynamics in Romania, Excel 2010 offers new features, such as sparklines and slicers, which can help discover trends and statistics in the existing data.
The aim of this article is accurate assessment of Romania's public debt and its

  9. 78 FR 54453 - Notice of Public Meeting-Intersection of Cloud Computing and Mobility Forum and Workshop

    Science.gov (United States)

    2013-09-04

    ... National Institute of Standards and Technology Notice of Public Meeting--Intersection of Cloud Computing...-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum & Workshop events in..., portability, and security, discuss the Federal Government's experience with cloud computing, report on...

  10. Computational inference of H3K4me3 and H3K27ac domain length

    Directory of Open Access Journals (Sweden)

    Julian Zubek

    2016-03-01

    Full Text Available Background. Recent epigenomic studies have shown that the length of a DNA region covered by an epigenetic mark is not just a byproduct of the assaying technologies and has functional implications for that locus. For example, expanded regions of DNA sequence that are marked by enhancer-specific histone modifications, such as acetylation of histone H3 lysine 27 (H3K27ac), coincide with cell-specific enhancers, known as super or stretch enhancers. Similarly, promoters of genes critical for cell-specific functions are marked by expanded H3K4me3 domains in the cognate cell type, and these can span DNA regions from 4–5 kb up to 40–50 kb in length. These expanded H3K4me3 domains are known as buffer domains or super promoters. Methods. To ask what correlates with—and potentially regulates—the length of loci marked with these two important histone marks, H3K4me3 and H3K27ac, we built Random Forest regression models. With these models, we computationally identified genomic and epigenomic patterns that are predictive of the length of these marks in seven ENCODE cell lines. Results. We found that certain epigenetic marks and transcription factors explain the variability of the length of H3K4me3 and H3K27ac marks across different cell types, which implies that the lengths of these two epigenetic marks are tightly regulated in a given cell type. Our source code for the regression models and data can be found at our GitHub page: https://github.com/zubekj/broad_peaks. Discussion. Our Random Forest based regression models enabled us to estimate the individual contribution of different epigenetic marks and protein binding patterns to the length of H3K4me3 and H3K27ac deposition patterns, therefore potentially revealing genomic signatures at cell-specific regulatory elements.
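    A Random Forest regression of the kind described can be sketched as follows. The data here are synthetic stand-ins (the real predictors are epigenetic-mark and transcription-factor-binding signals from ENCODE, and the target is the measured domain length), and scikit-learn's RandomForestRegressor is assumed rather than the authors' exact setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in: rows are loci, columns are hypothetical predictor
# signals (e.g. other histone marks, TF binding); target is domain length.
X = rng.normal(size=(500, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Feature importances estimate each predictor's individual contribution
# to the modeled domain length, as in the paper's analysis.
importances = model.feature_importances_
```

    On real data, the ranking of `importances` plays the role of the paper's per-mark contribution estimates.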

  11. A Discontinuous Galerkin Time-Domain Method with Dynamically Adaptive Cartesian Meshes for Computational Electromagnetics

    CERN Document Server

    Yan, Su; Arslanbekov, Robert R; Kolobov, Vladimir I; Jin, Jian-Ming

    2016-01-01

    A discontinuous Galerkin time-domain (DGTD) method based on dynamically adaptive Cartesian meshes (ACM) is developed for a full-wave analysis of electromagnetic fields in dispersive media. Hierarchical Cartesian grids offer simplicity close to that of structured grids and the flexibility of unstructured grids while being highly suited for adaptive mesh refinement (AMR). The developed DGTD-ACM achieves a desired accuracy by refining non-conformal meshes near material interfaces to reduce stair-casing errors without sacrificing the high efficiency afforded with uniform Cartesian meshes. Moreover, DGTD-ACM can dynamically refine the mesh to resolve the local variation of the fields during propagation of electromagnetic pulses. A local time-stepping scheme is adopted to alleviate the constraint on the time-step size due to the stability condition of the explicit time integration. Simulations of electromagnetic wave diffraction over conducting and dielectric cylinders and spheres demonstrate that the proposed meth...

  12. Time domain measurement representation in computer system diagnostics and performance analysis

    Directory of Open Access Journals (Sweden)

    Stanisław Wideł

    2013-06-01

    Full Text Available Time analysis is a common approach to testing and fault detection in the performance analysis of computer systems. The article shows that measuring and identifying performance based on a benchmark alone is not sufficient for a proper analysis of computer system behavior. The response time of a process is often composed of the execution of many subprocesses or many paths of execution. Under this assumption, it is shown that both convolution and deconvolution methods can be helpful in obtaining time distributions and modeling complex processes. In such modeling, the analysis of measurement errors is very important and is taken into consideration. An example of applying these methods to a buffering process is also discussed.
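    The convolution idea can be illustrated directly: if a response time is the sum of two independent subprocess times, its distribution is the convolution of the component distributions, and deconvolution recovers a component from the total. A minimal sketch with assumed discrete distributions (not data from the article):

```python
import numpy as np

# Two subprocess response-time distributions on a common time grid
# (discrete probability mass functions; values are illustrative).
p1 = np.array([0.5, 0.3, 0.2])   # subprocess A
p2 = np.array([0.1, 0.6, 0.3])   # subprocess B

# Distribution of the total response time of A followed by B:
# the convolution of the two pmfs.
total = np.convolve(p1, p2)

# Deconvolution (recovering p2 from total and p1) by pointwise
# division in the frequency domain.
n = len(total)
P1 = np.fft.rfft(p1, n)
T = np.fft.rfft(total, n)
p2_rec = np.fft.irfft(T / P1, n)[: len(p2)]
```

    In practice, as the article notes, measurement noise makes naive spectral division ill-conditioned, so regularized deconvolution is used; the sketch shows only the noiseless principle.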

  13. Solving Problems in Various Domains by Hybrid Models of High Performance Computations

    Directory of Open Access Journals (Sweden)

    Yurii Rogozhin

    2014-03-01

    Full Text Available This work presents a hybrid model of high performance computations. The model is based on a membrane system (P system), where some membranes may contain a quantum device that is triggered by the data entering the membrane. This model is intended to take advantage of both the biomolecular and quantum paradigms and to overcome some of their inherent limitations. The proposed approach is demonstrated on two selected problems: SAT and image retrieval.

  14. State-of-the-art soft computing techniques in image steganography domain

    Science.gov (United States)

    Hussain, Hanizan Shaker; Din, Roshidi; Samad, Hafiza Abdul; Yaacub, Mohd Hanafizah; Murad, Roslinda; Rukhiyah, A.; Sabdri, Noor Maizatulshima

    2016-08-01

    This paper reviews major works on soft computing (SC) techniques in image steganography and watermarking over the last ten years, focusing on three main SC techniques: neural networks, genetic algorithms, and fuzzy logic. The findings suggest that all these works applied SC techniques during the pre-processing, embedding, or extraction stages, or in more than one of these stages. Therefore, the presence of SC techniques, with their diverse approaches and strengths, can help researchers in future work to attain excellent quality of image information hiding that comprises both imperceptibility and robustness.

  15. Space-frequency analysis with parallel computing in a phase-sensitive optical time-domain reflectometer distributed sensor.

    Science.gov (United States)

    Hui, Xiaonan; Ye, Taihang; Zheng, Shilie; Zhou, Jinhai; Chi, Hao; Jin, Xiaofeng; Zhang, Xianmin

    2014-10-01

    For a phase-sensitive optical time-domain reflectometer (ϕ-OTDR) distributed sensor system, space-frequency analysis can reduce false alarms by analyzing the frequency distribution, compared with the traditional difference-value method. We propose a graphics processing unit (GPU)-based parallel computing method to perform multichannel fast Fourier transforms (FFT) and realize real-time space-frequency analysis. The experimental results show that the time taken by the multichannel FFT decreased considerably with this GPU parallel computing. The method works with a sensing fiber up to 16 km long and an entry-level GPU. Meanwhile, the GPU reduces the computing load of the central processing unit from 70% to less than 20%. We carried out an experiment on a two-point space-frequency analysis, and the results clearly and simultaneously show the vibration point locations and frequency components. The sensor system outputs the real-time space-frequency spectra continuously with a spatial resolution of 16.3 m and a frequency resolution of 2.25 Hz.
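    The multichannel FFT step can be sketched with a batched NumPy transform; the paper's GPU implementation performs the same per-channel FFTs in parallel (e.g. CuPy exposes an equivalent API), and all signal parameters below are illustrative assumptions, not the system's actual values:

```python
import numpy as np

fs = 1000.0                        # assumed sampling rate per channel (Hz)
n_channels, n_samples = 64, 1024   # spatial positions along the fiber x time

rng = np.random.default_rng(1)
traces = rng.normal(scale=0.01, size=(n_channels, n_samples))
t = np.arange(n_samples) / fs
traces[20] += np.sin(2 * np.pi * 50.0 * t)   # simulated vibration: channel 20, 50 Hz

# Batched FFT over all channels at once (axis=-1); a GPU library such as
# CuPy would execute these per-channel transforms in parallel.
spectra = np.abs(np.fft.rfft(traces, axis=-1))
freqs = np.fft.rfftfreq(n_samples, d=1 / fs)

# Locate the strongest non-DC component in the space-frequency map.
peak_channel = np.unravel_index(spectra[:, 1:].argmax(), spectra[:, 1:].shape)[0]
peak_freq = freqs[1:][spectra[peak_channel, 1:].argmax()]
```

    The resulting `spectra` array is exactly the space-frequency map the sensor displays: one axis is fiber position, the other is vibration frequency.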

  16. Highly accurate and efficient self-force computations using time-domain methods: Error estimates, validation, and optimization

    CERN Document Server

    Thornburg, Jonathan

    2010-01-01

    If a small "particle" of mass μM (with μ ≪ 1) orbits a Schwarzschild or Kerr black hole of mass M, the particle is subject to an O(μ) radiation-reaction "self-force". Here I argue that it is valuable to compute this self-force highly accurately (relative error ≲ 10^{-6}) and efficiently, and I describe techniques for doing this and for obtaining and validating error estimates for the computation. I use an adaptive-mesh-refinement (AMR) time-domain numerical integration of the perturbation equations in the Barack-Ori mode-sum regularization formalism; this is efficient, yet allows easy generalization to arbitrary particle orbits. I focus on the model problem of a scalar particle in a circular geodesic orbit in Schwarzschild spacetime. The mode-sum formalism gives the self-force as an infinite sum of regularized spherical-harmonic modes, $\sum_{\ell=0}^{\infty} F_{\ell,\mathrm{reg}}$, with $F_{\ell,\mathrm{reg}}$ (and an "internal" error estimate) computed numerically for ℓ ≲ 30 and estimated ...

  17. Computer-aided classification of rheumatoid arthritis in finger joints using frequency domain optical tomography

    Science.gov (United States)

    Klose, C. D.; Kim, H. K.; Netz, U.; Blaschke, S.; Zwaka, P. A.; Mueller, G. A.; Beuthan, J.; Hielscher, A. H.

    2009-02-01

    Novel methods that can help in the diagnosis and monitoring of joint disease are essential for efficient use of the arthritis therapies that are currently emerging. Building on previous studies that involved continuous-wave imaging systems, we present here the first clinical data obtained with a new frequency-domain imaging system. Three-dimensional tomographic data sets of absorption and scattering coefficients were generated for 107 fingers. The data were analyzed using ANOVA, MANOVA, Discriminant Analysis (DA), and a machine-learning algorithm based on self-organizing maps (SOM) for clustering data in 2-dimensional parameter spaces. Overall we found that the SOM algorithm outperforms the more traditional analysis methods in terms of correctly classifying finger joints. Using SOM, healthy and affected joints can now be separated with a sensitivity of 0.97 and a specificity of 0.91. Furthermore, preliminary results suggest that if a combination of multiple image properties is used, statistically significant differences can be found between RA-affected finger joints that show different clinical features (e.g. effusion, synovitis, or erosion).

  18. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer's disease

    Science.gov (United States)

    Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong

    2016-07-01

    Computer-based diagnosis of Alzheimer's disease can be performed through analysis of the functional and structural changes in the brain. Multispectral image fusion combines complementary information, while discarding redundant information, to achieve a single image that encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer's disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT, followed by dimensionality reduction using a modified Principal Component Analysis algorithm on the low-frequency coefficients. Further, the high-frequency coefficients are enhanced using a non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: phase congruency is applied to the low-frequency coefficients, and a combination of directive contrast and normalized Shannon entropy is applied to the high-frequency coefficients. The superiority of the fusion response is demonstrated by comparisons with other state-of-the-art fusion approaches (in terms of various fusion metrics).

  19. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    Energy Technology Data Exchange (ETDEWEB)

    Bhateja, Vikrant, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn; Moin, Aisha; Srivastava, Anuja [Shri Ramswaroop Memorial Group of Professional Colleges (SRMGPC), Lucknow, Uttar Pradesh 226028 (India); Bao, Le Nguyen [Duytan University, Danang 550000 (Viet Nam); Lay-Ekuakille, Aimé [Department of Innovation Engineering, University of Salento, Lecce 73100 (Italy); Le, Dac-Nhuong, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn [Duytan University, Danang 550000 (Viet Nam); Haiphong University, Haiphong 180000 (Viet Nam)

    2016-07-15

    Computer-based diagnosis of Alzheimer’s disease can be performed through analysis of the functional and structural changes in the brain. Multispectral image fusion combines complementary information, while discarding redundant information, to achieve a single image that encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT, followed by dimensionality reduction using a modified Principal Component Analysis algorithm on the low-frequency coefficients. Further, the high-frequency coefficients are enhanced using a non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: phase congruency is applied to the low-frequency coefficients, and a combination of directive contrast and normalized Shannon entropy is applied to the high-frequency coefficients. The superiority of the fusion response is demonstrated by comparisons with other state-of-the-art fusion approaches (in terms of various fusion metrics).

  20. [Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure].

    Science.gov (United States)

    Yokohama, Noriya

    2013-07-01

    This report aimed to design the architecture and measure the performance of a parallel computing environment for Monte Carlo simulation in particle therapy planning, using a high-performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed a speed approximately 28 times that of a single-threaded architecture, combined with improved stability. A study of methods for optimizing system operations also indicated lower cost.

  1. Artificial proteins as allosteric modulators of PDZ3 and SH3 in two-domain constructs: A computational characterization of novel chimeric proteins.

    Science.gov (United States)

    Kirubakaran, Palani; Pfeiferová, Lucie; Boušová, Kristýna; Bednarova, Lucie; Obšilová, Veronika; Vondrášek, Jiří

    2016-10-01

    Artificial multidomain proteins with enhanced structural and functional properties can be utilized in a broad spectrum of applications. The design of chimeric fusion proteins utilizing protein domains or one-domain miniproteins as building blocks is an important advancement for the creation of new biomolecules for biotechnology and medical applications. However, computational studies to describe in detail the dynamics and geometry properties of two-domain constructs made from structurally and functionally different proteins are lacking. Here, we tested an in silico design strategy using all-atom explicit solvent molecular dynamics simulations. The well-characterized PDZ3 and SH3 domains of human zonula occludens (ZO-1) (3TSZ), along with 5 artificial domains and 2 types of molecular linkers, were selected to construct chimeric two-domain molecules. The influence of the artificial domains on the structure and dynamics of the PDZ3 and SH3 domains was determined using a range of analyses. We conclude that the artificial domains can function as allosteric modulators of the PDZ3 and SH3 domains. Proteins 2016; 84:1358-1374. © 2016 Wiley Periodicals, Inc.

  2. Use of media and public-domain Internet sources for detection and assessment of plant health threats

    Directory of Open Access Journals (Sweden)

    David M. Hartley

    2011-09-01

    Event-based biosurveillance is a recognized approach to early warning and situational awareness of emerging health threats. In this study, we build upon previous human and animal health work to develop a new approach to plant pest and pathogen surveillance. We show that monitoring public domain electronic media for indications and warning of epidemics and associated social disruption can provide information about the emergence and progression of plant pest infestation or disease outbreak. The approach is illustrated using a case study, which describes a plant pest and pathogen epidemic in China and Vietnam from February 2006 to December 2007, and the role of ducks in contributing to zoonotic virus spread in birds and humans. This approach could be used as a complementary method to traditional plant pest and pathogen surveillance to aid global and national plant protection officials and political leaders in early detection and timely response to significant biological threats to plant health, economic vitality, and social stability. This study documents the inter-relatedness of health in human, animal, and plant populations and emphasizes the importance of plant health surveillance.

  3. An Overview of a Decade of Journal Publications about Culture and Human-Computer Interaction (HCI)

    Science.gov (United States)

    Clemmensen, Torkil; Roese, Kerstin

    In this paper, we analyze the concept of human-computer interaction in cultural and national contexts. Building on and extending the framework for understanding research in usability and culture by Honold [3], we give an overview of publications on culture and HCI between 1998 and 2008, with a narrow focus on high-level journal publications only. The purpose is to review current practice in how cultural HCI issues are studied, and to analyse problems with the measures and interpretation of these studies. We find that Hofstede's cultural dimensions have been the dominating model of culture, that participants have been picked because they could speak English, and that most studies have been large-scale quantitative studies. In order to balance this situation, we recommend that more researchers and practitioners conduct qualitative, empirical studies.

  4. Characterization of calmodulin-Fas death domain interaction: an integrated experimental and computational study.

    Science.gov (United States)

    Fancy, Romone M; Wang, Lingyun; Napier, Tiara; Lin, Jiabei; Jing, Gu; Lucius, Aaron L; McDonald, Jay M; Zhou, Tong; Song, Yuhua

    2014-04-29

    The Fas death receptor-activated death-inducing signaling complex (DISC) regulates apoptosis in many normal and cancer cells. Qualitative biochemical experiments demonstrate that calmodulin (CaM) binds to the death domain of Fas. The interaction between CaM and Fas regulates Fas-mediated DISC formation. A quantitative understanding of the interaction between CaM and Fas is important for the optimal design of antagonists for CaM or Fas to regulate the CaM-Fas interaction, thus modulating Fas-mediated DISC formation and apoptosis. The V254N mutation of the Fas death domain (Fas DD) is analogous to an identified mutant allele of Fas in lpr-cg mice that have a deficiency in Fas-mediated apoptosis. In this study, the interactions of CaM with the Fas DD wild type (Fas DD WT) and with the Fas DD V254N mutant were characterized using isothermal titration calorimetry (ITC), circular dichroism spectroscopy (CD), and molecular dynamics (MD) simulations. ITC results reveal an endothermic binding characteristic and an entropy-driven interaction of CaM with Fas DD WT or with Fas DD V254N. The Fas DD V254N mutation decreased the association constant (Ka) for CaM-Fas DD binding from (1.79 ± 0.20) × 10^6 to (0.88 ± 0.14) × 10^6 M^-1 and slightly increased the standard-state Gibbs free energy (ΔG°) for CaM-Fas DD binding from -8.87 ± 0.07 to -8.43 ± 0.10 kcal/mol. CD secondary structure analysis and MD simulation results did not show significant secondary structural changes of the Fas DD caused by the V254N mutation. The conformational and dynamical motion analyses, the analyses of hydrogen bond formation within the CaM binding region, the contact numbers of each residue, and the electrostatic potential for the CaM binding region based on MD simulations demonstrated changes caused by the Fas DD V254N mutation. These changes caused by the Fas DD V254N mutation could affect the van der Waals interactions and electrostatic interactions between CaM and Fas DD, thereby affecting

  5. Compression and denoising in magnetic resonance imaging via SVD on the Fourier domain using computer algebra

    Science.gov (United States)

    Díaz, Felipe

    2015-09-01

    Magnetic resonance (MR) data reconstruction can be a computationally challenging task. The signal-to-noise ratio might also present complications, especially with high-resolution images. In this sense, data compression can be useful not only for reducing complexity and memory requirements, but also for reducing noise, even allowing spurious components to be eliminated. This article proposes the use of a system based on low-order singular value decomposition for reconstruction and noise reduction in an MR imaging system. The proposed method is evaluated using in vivo MRI data. Rebuilt images using less than 20% of the original data, and with similar quality in terms of visual inspection, are presented. A quantitative evaluation of the method is also presented.
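
The low-order SVD idea in this abstract can be sketched as a rank-k truncation. This is an illustrative sketch, not the article's pipeline; the synthetic low-rank image and noise level below are stand-ins for real MR data:

```python
import numpy as np

def svd_truncate(img, k):
    """Reconstruct a 2-D image from its k largest singular components."""
    U, s, Vt = np.linalg.svd(img, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]   # rank-k approximation

# Synthetic stand-in for MR data: a smooth (low-rank) image plus noise.
rng = np.random.default_rng(0)
clean = np.outer(np.sin(np.linspace(0, 3, 64)), np.cos(np.linspace(0, 2, 64)))
noisy = clean + 0.05 * rng.standard_normal(clean.shape)

denoised = svd_truncate(noisy, k=4)   # keeps only 4 of 64 singular values
```

Storing only `U[:, :k]`, `s[:k]`, and `Vt[:k, :]` needs k(2n+1) numbers instead of n^2, which is where the compression comes from; the truncation also discards the small singular values dominated by noise, which is the denoising effect the abstract describes.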

  6. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity

    Directory of Open Access Journals (Sweden)

    Yinsheng Zhang

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible for CT recruitment criteria is a central task. Related work either depends on DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves the data mapping between a standard ontology/information model and individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm, based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules. The referenced rule variables are all from the underlying database schema. Then the production rule is translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is directly mapped to native database queries (e.g., SQL) automated by ORM (object-relational mapping).

  7. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity

    Science.gov (United States)

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible for CT recruitment criteria is a central task. Related work either depends on DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves the data mapping between a standard ontology/information model and individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm, based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules. The referenced rule variables are all from the underlying database schema. Then the production rule is translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is directly mapped to native database queries (e.g., SQL) automated by ORM (object-relational mapping).
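
A toy version of the rule-to-query translation can illustrate the paradigm. This is not the paper's implementation: the table and column names are hypothetical, and the intermediate LINQ stage is collapsed into a direct rule-to-SQL mapping:

```python
# Illustrative sketch: a recruitment criterion, represented as
# production-rule triples over the database schema, is compiled
# into a SQL WHERE clause. Table/column names are hypothetical.
OPS = {"eq": "=", "lt": "<", "gt": ">", "le": "<=", "ge": ">="}

def rule_to_sql(table, criteria):
    """criteria: list of (column, operator, value) triples."""
    clauses = []
    for col, op, val in criteria:
        lit = f"'{val}'" if isinstance(val, str) else str(val)
        clauses.append(f"{col} {OPS[op]} {lit}")
    return f"SELECT patient_id FROM {table} WHERE " + " AND ".join(clauses)

# Example: a (hypothetical) ROP screening criterion.
sql = rule_to_sql("rop_screening",
                  [("gestational_age_weeks", "lt", 32),
                   ("birth_weight_g", "le", 1500)])
```

A production system would let an ORM emit parameterized queries rather than interpolating literals into strings, which is the role the paper assigns to the ORM layer.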

  8. How Large Is the "Public Domain"? A Comparative Analysis of Ringer's 1961 Copyright Renewal Study and HathiTrust CRMS Data

    Science.gov (United States)

    Wilkin, John P.

    2017-01-01

    The 1961 Copyright Office study on renewals, authored by Barbara Ringer, has cast an outsized influence on discussions of the U.S. 1923-1963 public domain. As more concrete data emerge from initiatives such as the large-scale determination process in the Copyright Review Management System (CRMS) project, questions are raised about the reliability…

  9. Public-Resource Computing: A New Paradigm for Computing and Science

    OpenAIRE

    2006-01-01

    This article explores the concept of Public-Resource Computing, an idea that has been developed with great success over recent years in the scientific community and that consists of harnessing the computing resources available in the millions of Internet-connected PCs around the world. The SETI@home project, the most successful representative of this concept, is discussed, and the BOINC platform (Ber...

  10. Public vs Private vs Hybrid vs Community - Cloud Computing: A Critical Review

    Directory of Open Access Journals (Sweden)

    Sumit Goyal

    2014-02-01

    These days cloud computing is booming like no other technology. Every organization, whether small, mid-sized or big, wants to adopt this cutting-edge technology for its business. As cloud technology becomes immensely popular among these businesses, the question arises: which cloud model should you consider for your business? There are four types of cloud models available in the market: Public, Private, Hybrid and Community. This review paper answers the question of which model would be most beneficial for your business. All four models are defined, discussed and compared, along with their benefits and pitfalls, thus giving you a clear idea of which model to adopt for your organization.

  11. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund the article discusses the experience of digitally scripted spaces, and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  12. Computational modeling of optical projection tomographic microscopy using the finite difference time domain method.

    Science.gov (United States)

    Coe, Ryan L; Seibel, Eric J

    2012-12-01

    We present a method for modeling image formation in optical projection tomographic microscopy (OPTM) using high numerical aperture (NA) condensers and objectives. Similar to techniques used in computed tomography, OPTM produces three-dimensional, reconstructed images of single cells from two-dimensional projections. The model is capable of simulating axial scanning of a microscope objective to produce projections, which are reconstructed using filtered backprojection. Simulation of optical scattering in transmission optical microscopy is designed to analyze all aspects of OPTM image formation, such as degree of specimen staining, refractive-index matching, and objective scanning. In this preliminary work, a set of simulations is performed to examine the effect of changing the condenser NA, objective scan range, and complex refractive index on the final reconstruction of a microshell with an outer radius of 1.5 μm and an inner radius of 0.9 μm. The model lays the groundwork for optimizing OPTM imaging parameters and triaging efforts to further improve the overall system design. As the model is expanded in the future, it will be used to simulate a more realistic cell, which could lead to even greater impact.

  13. Computer Catalog and Semantic Search of Data in the Domain of Cast Iron Processing

    Directory of Open Access Journals (Sweden)

    Rojek G.

    2017-06-01

    The aim of this study is to design and implement a computer system that allows semantic cataloging and data retrieval in the field of cast iron processing. The intention is for the system architecture to accommodate data on various processing techniques, based on the information available to, or searched for by, a potential user. This is achieved by separating the system code from knowledge of the processing operations and of the chemical composition of the material being processed, which is made possible by the creation and subsequent use of a formal knowledge representation in the form of an ontology. Any use of the system is thus associated with the use of the ontology, either as an aid in cataloging new data or as an indication of restrictions imposed on the data to which the user's attention is drawn. The use of a formal knowledge representation also allows semantic meaning to be considered, so that, for example, a search for a process class or material grade returns all elements in its subclasses.

  14. A novel image-domain-based cone-beam computed tomography enhancement algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Li Xiang; Li Tianfang; Yang Yong; Heron, Dwight E; Huq, M Saiful, E-mail: lix@upmc.edu [Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, PA 15232 (United States)

    2011-05-07

    Kilo-voltage (kV) cone-beam computed tomography (CBCT) plays an important role in image-guided radiotherapy. However, due to a large cone-beam angle, scatter effects significantly degrade the CBCT image quality and limit its clinical application. The goal of this study is to develop an image enhancement algorithm to reduce the low-frequency CBCT image artifacts, which are also called the bias field. The proposed algorithm is based on the hypothesis that image intensities of different types of materials in CBCT images are approximately globally uniform (in other words, a piecewise property). A maximum a posteriori probability framework was developed to estimate the bias field contribution from a given CBCT image. The performance of the proposed CBCT image enhancement method was tested using phantoms and clinical CBCT images. Compared to the original CBCT images, the corrected images using the proposed method achieved a more uniform intensity distribution within each tissue type and significantly reduced cupping and shading artifacts. In a head and a pelvic case, the proposed method reduced the Hounsfield unit (HU) errors within the region of interest from 300 HU to less than 60 HU. In a chest case, the HU errors were reduced from 460 HU to less than 110 HU. The proposed CBCT image enhancement algorithm demonstrated a promising result by the reduction of the scatter-induced low-frequency image artifacts commonly encountered in kV CBCT imaging.

  15. Frequency-domain analysis of computer-controlled optical surfacing processes

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Mid- to high-spatial-frequency errors are often induced on optical surfaces polished by computer-controlled optical surfacing (CCOS) processes. In order to efficiently remove these errors, which would degrade the performance of optical systems, the ability of a CCOS process to correct them has been investigated, based on the convolution integral model and in view of the availability of material removal. To quantify this ability, concepts such as figure-correcting ability and material removal availability (MRA) have been proposed. The analysis reveals that the MRA of a CCOS process in correcting a single spatial-frequency error is determined by its tool removal function (TRF), and equals the normalized amplitude spectrum of the Fourier transform of the TRF. Finally, three sine surfaces were etched using ion beam figuring (IBF), a typical CCOS process. The experimental results verify the theoretical analysis. The method employed and the conclusions of this work provide a useful mathematical basis for analyzing and optimizing CCOS processes.
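
The stated result, that the MRA equals the normalized amplitude spectrum of the TRF's Fourier transform, is easy to reproduce numerically. The Gaussian TRF shape and widths below are assumed for illustration:

```python
import numpy as np

# A Gaussian tool removal function (TRF) sampled over a 100 mm window;
# the Gaussian shape and the 10 mm spot size are illustrative assumptions.
x = np.linspace(-50, 50, 1024)                   # position, mm
trf = np.exp(-x**2 / (2 * 10.0**2))              # removal depth per unit dwell

spectrum = np.abs(np.fft.rfft(trf))              # amplitude spectrum of the TRF
mra = spectrum / spectrum.max()                  # normalized => the MRA
freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])   # spatial frequency, cycles/mm
```

The MRA is 1 at zero spatial frequency and falls off rapidly with frequency, which is exactly why a wide, smooth TRF struggles to correct mid-to-high-spatial-frequency errors.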

  16. FEM-BEM coupling methods for Tokamak plasma axisymmetric free-boundary equilibrium computations in unbounded domains

    Science.gov (United States)

    Faugeras, Blaise; Heumann, Holger

    2017-08-01

    Incorporating boundary conditions at infinity into simulations on bounded computational domains is a recurring problem in scientific computing. The combination of finite element methods (FEM) and boundary element methods (BEM) is the obvious instrument, and here we adapt, for the first time, the two standard FEM-BEM coupling approaches to the free-boundary equilibrium problem: the Johnson-Nédélec coupling and the Bielak-MacCamy coupling. We also recall the classical approach for fusion applications, dubbed the von-Hagenow-Lackner coupling after its first appearance, and present the less-used alternative introduced by Albanese, Blum and de Barbieri in [2]. We show that the von-Hagenow-Lackner coupling suffers from undesirable, non-optimal convergence properties, which suggests that other coupling schemes, in particular Johnson-Nédélec or Albanese-Blum-de Barbieri, are more appropriate for non-linear equilibrium problems. Moreover, we show that any such coupling method requires Newton-like iteration schemes for solving the corresponding non-linear discrete algebraic systems.

  17. The Impact and Challenges of Cloud Computing Adoption on Public Universities in Southwestern Nigeria

    Directory of Open Access Journals (Sweden)

    Oyeleye Christopher Akin

    2014-08-01

    This study investigates the impact and challenges of the adoption of cloud computing by public universities in the southwestern part of Nigeria. In each university, a sample of 100 IT staff, 50 para-IT staff and 50 students was selected using stratified sampling techniques, with the aid of well-structured questionnaires. Microsoft Excel was used to capture the data, while frequency and percentage distributions were used to analyze it. In all, 2,000 copies of the questionnaire were administered across the ten (10) public universities in the southwestern part of Nigeria, and 1,742 copies were returned, representing a response rate of 87.1%. The findings revealed that the adoption of cloud computing has a significant impact on cost effectiveness, enhanced availability, low environmental impact, reduced IT complexity, mobility, scalability, increased operability and reduced investment in physical assets. However, the major challenges confronting the adoption of the cloud are data insecurity, regulatory compliance concerns, lock-in and privacy concerns. This paper concludes by recommending strategies to manage the identified challenges in the study area.

  18. Public library computer training for older adults to access high-quality Internet health information.

    Science.gov (United States)

    Xie, Bo; Bugg, Julie M

    2009-09-01

    An innovative experiment to develop and evaluate a public library computer training program to teach older adults to access and use high-quality Internet health information involved a productive collaboration among public libraries, the National Institute on Aging and the National Library of Medicine of the National Institutes of Health (NIH), and a Library and Information Science (LIS) academic program at a state university. One hundred and thirty-one older adults aged 54-89 participated in the study between September 2007 and July 2008. Key findings include: a) participants had overwhelmingly positive perceptions of the training program; b) after learning about two NIH websites (http://nihseniorhealth.gov and http://medlineplus.gov) from the training, many participants started using these online resources to find high-quality health and medical information and, further, to guide their decision-making regarding health- or medically-related matters; and c) computer anxiety decreased significantly. These findings have implications for public libraries, LIS academic programs, and other organizations interested in providing similar programs in their communities.

  19. High-performance parallel computing in the classroom using the public goods game as an example

    Science.gov (United States)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
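
A minimal, CPU-only version of such a classroom simulation (without the GPU acceleration the paper advocates) might look as follows; the lattice size, synergy factor r, and selection noise K are illustrative choices:

```python
import numpy as np

def pgg_mc(L=30, r=4.5, K=0.5, steps=100, seed=0):
    """Monte Carlo simulation of the spatial public goods game on an L x L
    lattice with periodic boundaries. Each site plays in the 5 overlapping
    groups centred on itself and its von Neumann neighbours; strategies
    spread by the Fermi imitation rule."""
    rng = np.random.default_rng(seed)
    s = rng.integers(0, 2, (L, L))            # 1 = cooperator, 0 = defector

    def payoff(grid, i, j):
        p = 0.0
        for ci, cj in [(i, j), (i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]:
            ci %= L; cj %= L                  # centre of one 5-member group
            members = [(ci, cj), (ci - 1, cj), (ci + 1, cj),
                       (ci, cj - 1), (ci, cj + 1)]
            nc = sum(grid[a % L, b % L] for a, b in members)
            p += r * nc / 5.0 - grid[i, j]    # share of the pot minus own cost
        return p

    for _ in range(steps * L * L):            # steps full Monte Carlo sweeps
        i, j = rng.integers(0, L, 2)
        di, dj = [(-1, 0), (1, 0), (0, -1), (0, 1)][rng.integers(4)]
        ni, nj = (i + di) % L, (j + dj) % L
        if s[i, j] != s[ni, nj]:
            dp = payoff(s, ni, nj) - payoff(s, i, j)
            if rng.random() < 1.0 / (1.0 + np.exp(-dp / K)):
                s[i, j] = s[ni, nj]           # imitate the neighbour's strategy
    return s.mean()                           # final fraction of cooperators

frac = pgg_mc(L=16, steps=30)                 # small run for demonstration
```

On a GPU, each site's payoff is evaluated by its own thread and updates proceed in a checkerboard pattern, which is where the orders-of-magnitude speedup described above comes from.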

  20. Sound attenuation analysis of water-filled perforated pipe silencers using three-dimensional time-domain computational fluid dynamics approach

    Directory of Open Access Journals (Sweden)

    Xu Zhou

    2016-04-01

    The three-dimensional time-domain computational fluid dynamics approach is employed to calculate and analyze the sound attenuation behavior of water-filled perforated pipe silencers. Transmission loss predictions from the time-domain computational fluid dynamics approach and the frequency-domain finite element method agree well with each other for the straight-through and cross-flow perforated pipe silencers without flow. Then, the time-domain computational fluid dynamics approach is used to investigate the effects of flow velocity, diameter, and porosity of orifices on the sound attenuation behavior of the silencers. The numerical predictions demonstrate that the flow increases the transmission loss, especially at high frequencies. Based on the above analysis, partially plugged straight-through perforated pipe silencer is proposed to improve the sound attenuation performance by increasing the flow velocity through the orifices. In order to eliminate the pass frequency of the perforated pipe silencers and improve the sound attenuation performance in mid- to high-frequency range, a folded straight-through perforated pipe silencer is designed and its sound attenuation behavior is analyzed numerically using the time-domain computational fluid dynamics approach.

  1. Digital utopias and real cities—computer-generated images in re-design of public space

    Directory of Open Access Journals (Sweden)

    Marianna Michałowska

    2015-12-01

    Virtual environments are seen nowadays as extensions of our physical activities in the city. Are people, however, aware of what the digitally mediated cities they live in are? The starting point of my paper is the question of how computer-generated images (CGIs) influence human perception of real space. I am interested in the conflict between a vision and reality which occurs when an architectural project materialises in a public space and is subsequently rejected by the inhabitants. To explain this conflict, I use the notion of digital utopias and compare CGIs with the great tradition of "paper" architecture. I analyse two case studies from a medium-sized Polish city, Poznań: the first is the redevelopment of the Main Railway Station; the second is the re-design of a local square. The analysis focuses on the ambiguity of CGIs used to advertise new investments. The Station in its digital-visualisation phase was appreciated by the inhabitants of Poznań, but when the project was finally realised, strong criticism from its users followed. The square provoked public protests already in the visualisation phase. In conclusion, I argue that the concept of agonistic public spaces should be expanded and that its virtual dimension should be taken into consideration as well. When dealing with hyper-realistic CGIs, we experience a certain utopia; confronted with their material execution, we often experience a dystopian disillusion which stirs us into action.

  2. Estimating and modelling bias of the hierarchical partitioning public-domain software: implications in environmental management and conservation.

    Directory of Open Access Journals (Sweden)

    Pedro P Olea

    BACKGROUND: Hierarchical partitioning (HP) is an analytical method for multiple regression that identifies the most likely causal factors while alleviating multicollinearity problems. Its use in ecology and conservation is increasing because it usefully complements multiple regression analysis. A public-domain software package, "hier.part", has been developed for running HP in the R software environment. Its authors note a "minor rounding error" for hierarchies constructed from >9 variables; however, the potential bias introduced by using this module has not yet been examined. Knowing this bias is pivotal because, for example, the ranking obtained in HP is being used as a criterion for establishing conservation priorities. METHODOLOGY/PRINCIPAL FINDINGS: Using numerical simulations and two real examples, we assessed the robustness of this HP module with respect to the order of the variables in the analysis. The results indicated a considerable effect of variable order on the amount of independent variance explained by predictors in models with >9 explanatory variables. For these models, the nominal ranking of importance of the predictors changed with variable order, i.e. predictors declared important by their contribution to explaining the response variable frequently became either more or less important under other variable orders. The probability of a variable changing position was best explained by the difference in independent explanatory power between that variable and the previous one in the nominal ranking of importance: the smaller this difference, the more likely the change of position. CONCLUSIONS/SIGNIFICANCE: HP should be applied with caution when more than 9 explanatory variables are used to determine the ranking of covariate importance. The explained variance is not a useful parameter in models with more than 9 independent variables. The inconsistency in the results obtained by HP should be considered in future studies as well as in those
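
The quantity HP estimates, the independent contribution of each predictor, is the increase in R² that predictor brings, averaged over all orderings of the predictors. A direct re-implementation for illustration (enumerating permutations, so it scales factorially and is only practical for small k):

```python
import numpy as np
from itertools import permutations

def r2(X, y, cols):
    """R-squared of an OLS fit of y on the given columns (plus intercept)."""
    A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - resid.var() / y.var()

def hier_part(X, y):
    """Independent contribution of each predictor: its increase in R-squared,
    averaged over every ordering in which it can enter the model."""
    k = X.shape[1]
    contrib = np.zeros(k)
    perms = list(permutations(range(k)))
    for order in perms:
        done = []
        for c in order:
            base = r2(X, y, done) if done else 0.0   # fit before adding c
            contrib[c] += r2(X, y, done + [c]) - base
            done.append(c)
    return contrib / len(perms)

# Small synthetic example: y depends strongly on X0, weakly on X1, not on X2.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = 2 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(200)
ic = hier_part(X, y)
```

By construction the contributions telescope, so they sum exactly to the full-model R² regardless of variable order. The hier.part package instead traverses the 2^k submodel hierarchy, which is equivalent in exact arithmetic; either way the cost explodes with k, which is the >9-variable regime where the rounding behaviour discussed above matters.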

  3. Domains of quality of life: results of a three-stage Delphi consensus procedure among patients, family of patients, clinicians, scientists and the general public.

    Science.gov (United States)

    Pietersma, Suzanne; de Vries, Marieke; van den Akker-van Marle, M Elske

    2014-06-01

    Our key objective is to identify the core domains of health-related quality of life (QoL). Health-related QoL utility scales are commonly used in economic evaluations to assess the effectiveness of health-care interventions. However, health-care interventions are likely to affect QoL in a broader sense than is quantifiable with traditional scales. Therefore, measures need to go beyond these scales. Unfortunately, there is no consensus in the scientific literature on the essential domains of QoL. We conducted a three-stage online Delphi consensus procedure to identify the key domains of health-related QoL. Five stakeholder groups (i.e., patients, family of patients, clinicians, scientists and general public) were asked, on three consecutive occasions, what they perceive as the most important domains of health-related QoL. An analysis of existing (health-related) QoL and well-being measurements formed the basis of the Delphi-procedure. In total, 42 domains of QoL were judged, covering physical, mental and social aspects. All participants rated 'self-acceptance', 'self-esteem' and 'good social contacts' as essential. Strikingly, mental and social domains are perceived as more essential than physical domains across stakeholder groups. In traditionally used health-related QoL utility measures, physical domains like 'mobility' are prominently present. The Delphi-procedure shows that health-related QoL (utility) scales need to put sufficient emphasis on mental and social domains to capture aspects of QoL that are essential to people.

  4. Computing what the public wants: some issues in road safety cost-benefit analysis.

    Science.gov (United States)

    Hauer, Ezra

    2011-01-01

    In road safety, as in other fields, cost-benefit analysis (CBA) is used to justify the investment of public money and to establish priority between projects. It amounts to a computation by which 'few' - the CB analysts - aim to determine what the 'many' - those on behalf of which the choice is to be made - would choose. The question is whether there are grounds to believe that the tool fits the aim. I argue that the CBA tool is deficient. First, because estimates of the value of statistical life and injury on which the CBA computation rests are all over the place, inconsistent with the value of time estimates, and government guidance on the matter appears to be arbitrary. Second, because the premises of New Welfare Economics on which the CBA is founded apply only in circumstances which, in road safety, are rare. Third, because the CBA requires the computation of present values which must be questioned when the discounting is of future lives and of time. Because time savings are valued too highly when compared to life and because discounting tends to unjustifiably diminish the value of lives saved in the future, the CBA tends to bias decisions against investment in road safety.
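The present-value computation the author questions is simple to state; the figures below are illustrative, not taken from the paper:

```python
def present_value(amount, years, rate):
    """Discount a future benefit to its present value at a constant annual rate."""
    return amount / (1.0 + rate) ** years

# A statistical life valued at $10M (illustrative number) but saved 30 years from now:
vsl = 10_000_000
pv_high = present_value(vsl, 30, 0.07)   # at a 7% discount rate
pv_low = present_value(vsl, 30, 0.03)    # at a 3% discount rate
```

At a 7% discount rate a life saved 30 years from now counts for less than a seventh of one saved today (versus roughly 40% at 3%), which is the bias against long-horizon safety investments the abstract describes.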

  5. ICRP Publication 116—the first ICRP/ICRU application of the male and female adult reference computational phantoms

    CERN Document Server

    Petoussi-Henss, Nina; Eckerman, Keith F; Endo, Akira; Hertel, Nolan; Hunt, John; Menzel, Hans G; Pelliccioni, Maurizio; Schlattl, Helmut; Zankl, Maria

    2014-01-01

    ICRP Publication 116 on `Conversion coefficients for radiological protection quantities for external radiation exposures' provides fluence-to-dose conversion coefficients for organ-absorbed doses and effective dose for various types of external exposures (ICRP 2010 ICRP Publication 116). The publication supersedes ICRP Publication 74 (ICRP 1996 ICRP Publication 74, ICRU 1998 ICRU Report 57), including new particle types and expanding the energy ranges considered. The coefficients were calculated using the ICRP/ICRU computational phantoms (ICRP 2009 ICRP Publication 110) representing the reference adult male and reference adult female (ICRP 2002 ICRP Publication 89), together with a variety of Monte Carlo codes simulating the radiation transport in the body. Idealized whole-body irradiation from unidirectional and rotational parallel beams as well as isotropic irradiation was considered for a large variety of incident radiations and energy ranges. Comparison of the effective doses with operational quantit...

  6. Energy Efficiency in Public Buildings through Context-Aware Social Computing

    Science.gov (United States)

    García, Óscar; Alonso, Ricardo S.; Prieto, Javier; Corchado, Juan M.

    2017-01-01

    The challenge of promoting behavioral changes in users that lead to energy savings in public buildings has become a complex task requiring the involvement of multiple technologies. Wireless sensor networks have a great potential for the development of tools, such as serious games, that encourage users to acquire good energy and health habits in the workplace. This paper presents the development of a serious game using CAFCLA, a framework that allows for integrating multiple technologies, which provide both context-awareness and social computing. Game development has shown that the data provided by sensor networks encourage users to reduce energy consumption in their workplace and that social interactions and competitiveness allow for accelerating the achievement of good results and behavioral changes that favor energy savings. PMID:28398237

  7. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier

    Science.gov (United States)

    Luo, An; Sullivan, Thomas J.

    2010-04-01

    We introduce a user-friendly steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) system. Single-channel EEG is recorded using a low-noise dry electrode. Compared to traditional gel-based multi-sensor EEG systems, a dry sensor proves to be more convenient, comfortable and cost effective. A hardware system was built that displays four LED light panels flashing at different frequencies and synchronizes with EEG acquisition. The visual stimuli have been carefully designed such that potential risk to photosensitive people is minimized. We describe a novel stimulus-locked inter-trace correlation (SLIC) method for SSVEP classification using EEG time-locked to stimulus onsets. We studied how the performance of the algorithm is affected by different selection of parameters. Using the SLIC method, the average light detection rate is 75.8% with very low error rates (an 8.4% false positive rate and a 1.3% misclassification rate). Compared to a traditional frequency-domain-based method, the SLIC method is more robust (resulting in less annoyance to the users) and is also suitable for irregular stimulus patterns.
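The intuition behind a stimulus-locked, time-domain score can be illustrated with synthetic data (our simplified reading of SLIC, not the authors' exact algorithm; sampling rate, epoch length, and noise levels are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                      # sampling rate in Hz (assumed)
f_stim = 10.0                 # flicker frequency of the attended panel
n_trials, n_samp = 20, 125    # twenty 0.5 s epochs locked to stimulus onsets

t = np.arange(n_samp) / fs
# Attended stimulus: a phase-locked SSVEP response plus noise in every epoch.
attended = np.sin(2 * np.pi * f_stim * t) + 0.8 * rng.normal(size=(n_trials, n_samp))
# Unattended stimulus: epochs contain noise only.
unattended = 0.8 * rng.normal(size=(n_trials, n_samp))

def inter_trace_corr(epochs):
    """Mean pairwise Pearson correlation between stimulus-locked epochs."""
    c = np.corrcoef(epochs)
    i, j = np.triu_indices_from(c, k=1)
    return c[i, j].mean()

score_on = inter_trace_corr(attended)     # high: traces share the locked response
score_off = inter_trace_corr(unattended)  # near zero: noise does not line up
```

Epochs locked to the attended flicker share a phase-locked response and therefore correlate across trials; epochs locked to an unattended stimulus do not, so thresholding the score yields the detection decision.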

  8. Associations between neck musculoskeletal complaints and work related factors among public service computer workers in Kaunas

    Directory of Open Access Journals (Sweden)

    Gintaré Kaliniene

    2013-10-01

    Full Text Available Objectives: Information technologies have been developing very rapidly, including in occupational activities. Epidemiological studies have shown that employees who work with computers are more likely to complain of musculoskeletal disorders (MSD). The aim of this study was to evaluate associations between neck MSD and individual and work-related factors. Materials and Methods: The investigation, which consisted of two parts, a questionnaire study (using the Nordic Musculoskeletal Questionnaire and the Copenhagen Psychosocial Questionnaire) and a direct observation (to evaluate the ergonomic work environment using the RULA method), was carried out in three randomly selected public sector companies of Kaunas. The study population consisted of 513 public service office workers. Results: The survey showed that neck MSDs were very common in the investigated population, with a prevalence rate of 65.7%. According to our survey, neck MSDs were significantly associated with older age, longer work experience, high quantitative and cognitive job demands, working for longer than 2 h without taking a break, and a higher ergonomic risk score. In the fully adjusted model, working for longer than 2 h without taking a break had the strongest association with neck complaints. Conclusion: It was confirmed that neck MSDs were significantly associated with individual factors as well as conditions of work; therefore, preventive actions against neck complaints should be oriented at the psychosocial and ergonomic work environment as well as at individual factors.

  9. Combining Public Domain and Professional Panoramic Imagery for the Accurate and Dense 3d Reconstruction of the Destroyed Bel Temple in Palmyra

    Science.gov (United States)

    Wahbeh, W.; Nebiker, S.; Fangi, G.

    2016-06-01

    This paper exploits the potential of dense multi-image 3d reconstruction of destroyed cultural heritage monuments by either using public domain touristic imagery only or by combining the public domain imagery with professional panoramic imagery. The focus of our work is placed on the reconstruction of the temple of Bel, one of the Syrian heritage monuments, which was destroyed in September 2015 by the so-called "Islamic State". The great temple of Bel is considered as one of the most important religious buildings of the 1st century AD in the East with a unique design. The investigations and the reconstruction were carried out using two types of imagery. The first are freely available generic touristic photos collected from the web. The second are panoramic images captured in 2010 for documenting those monuments. In the paper we present a 3d reconstruction workflow for both types of imagery using state-of-the-art dense image matching software, addressing the non-trivial challenges of combining uncalibrated public domain imagery with panoramic images with very wide baselines. We subsequently investigate the aspects of accuracy and completeness obtainable from the public domain touristic images alone and from the combination with spherical panoramas. We furthermore discuss the challenges of co-registering the weakly connected 3d point cloud fragments resulting from the limited coverage of the touristic photos. We then describe an approach using spherical photogrammetry as a virtual topographic survey allowing the co-registration of a detailed and accurate single 3d model of the temple interior and exterior.

  10. Exploration of Preterm Birth Rates Using the Public Health Exposome Database and Computational Analysis Methods

    Directory of Open Access Journals (Sweden)

    Anne D. Kershenbaum

    2014-11-01

    Full Text Available Recent advances in informatics technology have made it possible to integrate, manipulate, and analyze variables from a wide range of scientific disciplines, allowing for the examination of complex social problems such as health disparities. This study used 589 county-level variables to identify and compare geographical variation of high and low preterm birth rates. Data were collected from a number of publicly available sources, bringing together natality outcomes with attributes of the natural, built, social, and policy environments. The singleton early premature birth rate in counties with a population over 100,000 persons provided the dependent variable. Graph theoretical techniques were used to identify a wide range of predictor variables from various domains, including black proportion, obesity and diabetes, sexually transmitted infection rates, mother's age, income, marriage rates, pollution and temperature among others. Dense subgraphs (paracliques) representing groups of highly correlated variables were resolved into latent factors, which were then used to build a regression model explaining prematurity (R-squared = 76.7%). Two lists of counties with large positive and large negative residuals, indicating unusual prematurity rates given their circumstances, may serve as a starting point for ways to intervene and reduce health disparities for preterm births.

  11. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    Science.gov (United States)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  12. Public-key Encryption Based on Extending Discrete Chebyshev Polynomials' Definition Domain to Real Number

    Institute of Scientific and Technical Information of China (English)

    陈宇; 韦鹏程

    2011-01-01

    By combining Chebyshev polynomials with modular arithmetic, their definition domain was extended to the real numbers; through theoretical verification and data analysis, several properties of real-domain polynomials applicable to public-key cryptography were established. Drawing on the structures of the traditional RSA and ElGamal public-key algorithms, a public-key encryption algorithm based on discrete Chebyshev polynomials extended to the real domain is proposed. The algorithm's structure is similar to that of RSA; its security rests on the intractability of integer factorization, as in RSA, or is comparable to the difficulty of the ElGamal discrete logarithm problem. It can resist the chosen-ciphertext attack against RSA and is easy to implement in software.
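The algebraic fact underlying Chebyshev-polynomial cryptosystems is the semigroup property T_r(T_s(x)) = T_rs(x), which plays the role that modular exponentiation plays in RSA and ElGamal. A minimal sketch (generic to this family of schemes; the paper's RSA-like construction over the extended real domain is not reproduced here):

```python
import math

def chebyshev(n, x):
    """T_n(x) for |x| <= 1, via the closed form T_n(x) = cos(n * arccos(x))."""
    return math.cos(n * math.acos(x))

# Semigroup property: composing T_r and T_s equals T_{r*s}, in either order.
x = 0.53          # public value
r, s = 7, 11      # private degrees of two parties (toy sizes)
shared_1 = chebyshev(r, chebyshev(s, x))
shared_2 = chebyshev(s, chebyshev(r, x))
direct = chebyshev(r * s, x)
```

In a Diffie-Hellman-style exchange, each party applies its private degree to the other's published value and both arrive at T_rs(x); practical schemes use large degrees and, as in the paper, combine the polynomials with modular arithmetic to avoid the numerical and security pitfalls of plain real-valued evaluation.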

  13. Computer analysis of antigenic domains and RGD-like sequences (RGWG) in the E glycoprotein of flaviviruses: an approach to vaccine development.

    Science.gov (United States)

    Becker, Y

    1990-09-01

    Antigenic domains and RGD-like sequences in the E glycoprotein of the flaviviruses Japanese encephalitis virus, yellow fever virus, West Nile virus, dengue type 4 virus, and tick-borne encephalitis virus were analyzed by computer programs that provide information on the physical properties of the polypeptides. The use of computer programs for the development of vaccines based on the synthesis of antigenic peptides is discussed. Synthetic viral peptides are proposed to be used for topical application so as to interfere with the virus-cell interaction. Viral peptides with antigenic epitopes to protect against dengue virus infection without enhancing pathogenesis may also be developed on the basis of the computer analysis.
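The first step of such an analysis, locating RGD and RGD-like motifs such as RGWG along a polypeptide, amounts to a pattern scan. A hypothetical sketch (the sequence and the motif pattern are invented for illustration):

```python
import re

def find_motifs(sequence, pattern=r"RG[DW]G?"):
    """Return (position, motif) pairs for RGD-like matches in a protein sequence."""
    return [(m.start(), m.group()) for m in re.finditer(pattern, sequence)]

seq = "MKTAYIRGDGLQRGWGAARNDT"   # made-up polypeptide for illustration
hits = find_motifs(seq)
```

Real analyses of the kind the abstract describes would combine such motif hits with predicted physical properties (hydrophilicity, surface accessibility) to select candidate antigenic peptides.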

  14. Public-Private Partnerships in Cloud-Computing Services in the Context of Genomic Research.

    Science.gov (United States)

    Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria

    2017-01-01

    Public-private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, the International Cancer Genome Consortium, and the 100,000 Genomes Project, the three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention to find appropriate treatments. Organized as PPPs and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control. Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of

  15. Configuration of the catalytic GIY-YIG domain of intron endonuclease I-TevI: coincidence of computational and molecular findings.

    Science.gov (United States)

    Kowalski, J C; Belfort, M; Stapleton, M A; Holpert, M; Dansereau, J T; Pietrokovski, S; Baxter, S M; Derbyshire, V

    1999-05-15

    I-TevI is a member of the GIY-YIG family of homing endonucleases. It is folded into two structural and functional domains, an N-terminal catalytic domain and a C-terminal DNA-binding domain, separated by a flexible linker. In this study we have used genetic analyses, computational sequence analysis and NMR spectroscopy to define the configuration of the N-terminal domain and its relationship to the flexible linker. The catalytic domain is an alpha/beta structure contained within the first 92 amino acids of the 245-amino acid protein followed by an unstructured linker. Remarkably, this structured domain corresponds precisely to the GIY-YIG module defined by sequence comparisons of 57 proteins including more than 30 newly reported members of the family. Although much of the unstructured linker is not essential for activity, residues 93-116 are required, raising the possibility that this region may adopt an alternate conformation upon DNA binding. Two invariant residues of the GIY-YIG module, Arg27 and Glu75, located in alpha-helices, have properties of catalytic residues. Furthermore, the GIY-YIG sequence elements for which the module is named form part of a three-stranded antiparallel beta-sheet that is important for I-TevI structure and function.

  16. PUBLIC LINEAR PROGRAMMING SOLUTION FOR THE DESIGN OF SECURE AND EFFICIENT COMPUTING IN CLOUD

    Directory of Open Access Journals (Sweden)

    Dr. R. V. Krishnaiah

    2013-09-01

    Full Text Available This next generation of computing holds enormous potential to stimulate economic growth and enable governments to reduce costs, increase transparency and expand services to citizens. Cloud computing delivers robust computational power to society at reduced cost and enables customers with limited computational resources to outsource their large computation workloads to the cloud, and economically enjoy massive computational power, bandwidth, storage, and even appropriate software that can be shared in a pay-per-use manner. Despite the tremendous benefits, security is the primary obstacle that prevents the wide adoption of this promising computing model, especially for customers whose confidential data are consumed and produced during the computation.

  17. Awareness of Accessibility Barriers in Computer-Based Instructional Materials and Faculty Demographics at South Dakota Public Universities

    Science.gov (United States)

    Olson, Christopher

    2013-01-01

    Advances in technology and course delivery methods have enabled persons with disabilities to enroll in higher education at an increasing rate. Federal regulations state persons with disabilities must be granted equal access to the information contained in computer-based instructional materials, but faculty at the six public universities in South…

  18. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    NARCIS (Netherlands)

    Jacobs, C.; Rikxoort, E.M. van; Murphy, K.; Prokop, M.; Schaefer-Prokop, C.M.; Ginneken, B. van

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database

  19. Predicting the Number of Public Computer Terminals Needed for an On-Line Catalog: A Queuing Theory Approach.

    Science.gov (United States)

    Knox, A. Whitney; Miller, Bruce A.

    1980-01-01

    Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. The authors claim the method could also be used to estimate the number of microform readers needed for a computer output microform (COM) catalog. Formulae are included. (Author/JD)
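Although the paper's formulae are not reproduced in the abstract, the standard queuing-theory approach to this sizing problem is the Erlang C formula for an M/M/c queue; the sketch below (our assumption, with illustrative numbers) increments the terminal count until the probability of a patron having to wait drops below a target:

```python
import math

def erlang_c(c, a):
    """Probability that an arriving user must wait in an M/M/c queue
    with offered load a = lambda/mu Erlangs (requires a < c)."""
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * c / (c - a)
    return top / (s + top)

def terminals_needed(arrivals_per_hour, mean_session_min, max_wait_prob):
    a = arrivals_per_hour * mean_session_min / 60.0   # offered load in Erlangs
    c = math.ceil(a) + 1                              # need c > a for stability
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c

# e.g. 60 patrons/hour, 5-minute catalog sessions, <10% chance of waiting
n = terminals_needed(60, 5, 0.10)
```

With an offered load of 5 Erlangs, nine terminals keep the waiting probability under 10%; the same calculation applies directly to sizing a bank of COM readers.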

  20. The national public's values and interests related to the Arctic National Wildlife Refuge: A computer content analysis

    Science.gov (United States)

    David N. Bengston; David P. Fan; Roger. Kaye

    2010-01-01

    This study examined the national public's values and interests related to the Arctic National Wildlife Refuge. Computer content analysis was used to analyze more than 23,000 media stories about the refuge from 1995 through 2007. Ten main categories of Arctic National Wildlife Refuge values and interests emerged from the analysis, reflecting a diversity of values,...

  1. Measuring Technological, Organizational and Environmental Factors Influencing the Adoption Intentions of Public Cloud Computing Using a Proposed Integrated Model

    Directory of Open Access Journals (Sweden)

    Minimol Anil Job

    2016-05-01

    Full Text Available The main objective of this research is to identify the factors influencing the intentions of private sector firms to adopt public cloud computing. The researcher examined ten factors influencing cloud computing adoption using a proposed integrated model which incorporates aspects of the Technology, Organization and Environment framework: Complexity, Compatibility, Security Concerns, Trialability, Cost Saving, Top Management Support, Prior IT Experience, Organizational Readiness, Competitive Pressure and External Support. To test these influencing factors, a survey was conducted and 122 valid responses were received from IT decision makers in forty firms in different industries. The results revealed that Compatibility, Cost Saving, Trialability and External Support are the main influential factors in the adoption intentions of public cloud computing. Future research could build on this study by developing a different model for each industry, because each industry has unique characteristics that can influence the adoption of technological innovations.

  2. Pre-feasibility Study of Astronomical Data Archive Systems Powered by Public Cloud Computing and Hadoop Hive

    CERN Document Server

    Eguchi, Satoshi

    2016-01-01

    The size of astronomical observational data is increasing yearly. For example, while the Atacama Large Millimeter/submillimeter Array is expected to generate 200 TB of raw data every year, the Large Synoptic Survey Telescope is estimated to produce 15 TB of raw data every night. Since the growth rate of computing power is much lower than that of astronomical data, providing high performance computing (HPC) resources together with scientific data will become common in the next decade. However, the installation and maintenance costs of an HPC system can be burdensome for the provider. I consider public cloud computing as an alternative way to obtain sufficient computing resources inexpensively. I build Hadoop and Hive clusters by utilizing a virtual private server (VPS) service and Amazon Elastic MapReduce (EMR), and measure their performances. The VPS cluster behaves differently day by day, while the EMR clusters are relatively stable. Since partitioning is essential for Hive, several partitioning algorithms are evaluated. In this pape...

  3. The public domain of newspapers and journals and Zou Taofen

    Institute of Scientific and Technical Information of China (English)

    董亚秋

    2012-01-01

      As is well known, Zou Taofen was an outstanding Chinese journalist and publisher, and moreover a progressive, patriotic political commentator. The periodicals he edited, the weekly Life, Mass Life and Resistance of All the People, constituted a public domain of the press through their objective stance, their public character, and their emphasis on discussion with readers on an equal footing. In his role as spokesman within this public domain, Zou Taofen actively published news commentary and set up quality columns such as readers' mailboxes and short commentaries to communicate with readers and colleagues. These editing and publishing activities fully demonstrate Zou Taofen's role in guiding public opinion within the public space of the press.

  4. Engineering a more thermostable blue light photo receptor Bacillus subtilis YtvA LOV domain by a computer aided rational design method.

    Directory of Open Access Journals (Sweden)

    Xiangfei Song

    Full Text Available The ability to design thermostable proteins offers enormous potential for the development of novel protein bioreagents. In this work, a combined computational and experimental method was developed to increase the Tm of the flavin mononucleotide based fluorescent protein, the Bacillus subtilis YtvA LOV domain, by 31 degrees Celsius, thus extending its applicability in thermophilic systems. Briefly, the method includes five steps: single-mutant computational screening to identify thermostable mutant candidates, experimental evaluation to confirm the positive selections, computational redesign around the thermostable mutation regions, experimental re-evaluation, and finally the combination of multiple mutations. The adopted method is simple and effective, can be applied to other important proteins where other methods have difficulties, and therefore provides a new tool to improve protein thermostability.

  5. BPO crude oil analysis data base user's guide: Methods, publications, computer access, correlations, uses, availability

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fox, B.; Paulz, J.

    1996-03-01

    The Department of Energy (DOE) has one of the largest and most complete collections of information on crude oil composition that is available to the public. The computer program that manages this database of crude oil analyses has recently been rewritten to allow easier access to this information. This report describes how the new system can be accessed and how the information contained in the Crude Oil Analysis Data Bank can be obtained.

  6. 77 FR 22326 - Privacy Act of 1974, as Amended by Public Law 100-503; Notice of a Computer Matching Program

    Science.gov (United States)

    2012-04-13

    ... HUMAN SERVICES Administration for Children and Families Privacy Act of 1974, as Amended by Public Law... 1974, as amended by Public Law 100-503. SUMMARY: In compliance with the Privacy Act of 1974, as amended by Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988, ACF is publishing...

  7. The sailor, the turtle and the jungle man - striking the balance between protection and public domain in fictional character merchandising

    OpenAIRE

    Preiss LL.M., Sven

    2013-01-01

    What is it that ‘Popeye the Sailor’, the ‘Teenage Mutant Hero Turtles’ and ‘Tarzan’ have in common? Besides being well-known fictitious characters, each of them is overwhelmingly successful in terms of entertainment (for the public) and revenue (for the industry behind it). They exemplify the possibility of the fictional characters’ owners not only using them for their basic purpose, i.e. in books, comics, movies, broadcasts, etc., but also of secondary exploitation of the characters’ gained ...

  8. Final Technical Report - Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)]

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to Computational Chemistry. We have created the first web portal (www.chemsem.com) that allows data created in the calculations of quantum chemistry, and other such chemistry calculations, to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used as an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web Portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only a start of what must be a long multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a new potential file standard (Common Standard for eXchange - CSX) for computational chemistry data. This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that are part of the graph database that the semantic web employs. We propose a CSX file as a convenient way to encapsulate computational chemistry data.

  9. Improving resolution in microscopic holography by computationally fusing multiple, obliquely-illuminated object waves in the Fourier domain

    Energy Technology Data Exchange (ETDEWEB)

    Price, Jeffery R [ORNL]; Bingham, Philip R [ORNL]; Thomas, Clarence E [ORNL]

    2007-01-01

    We present a computational method to increase the effective numerical aperture of a holographic microscopy system operating in air. Our optical system employs a reflection Mach-Zehnder architecture and computational reconstruction of the full complex (phase and amplitude) wavefront. Based on fundamental diffraction principles, different angles of incident illumination result in different diffracted orders of the object wave being imaged; we record, store, and computationally recombine these object waves to expand the spatial frequency response. Experimental results demonstrate an improvement in the effective numerical aperture of our system from 0.59 to 0.78. © 2006 Optical Society of America. OCIS codes: 090.1760, 090.2880.
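    The aperture-synthesis arithmetic behind the reported 0.59 → 0.78 improvement can be sketched as follows. Tilting the illumination by an angle θ shifts the band of object spatial frequencies that falls inside the pupil, so fusing tilted and normal captures gives roughly NA_eff ≈ NA + sin θ, capped at 1 in air. The 11° tilt below is an assumed value chosen to reproduce those numbers, not a parameter quoted in the abstract.

```python
import math

# Back-of-the-envelope synthetic-aperture estimate: oblique illumination
# at tilt_deg extends the captured frequency support by sin(tilt).
def effective_na(na, tilt_deg):
    return min(1.0, na + math.sin(math.radians(tilt_deg)))

na = 0.59
print(round(effective_na(na, 11.0), 2))  # 0.78
```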

  10. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 1: Theory and method

    Science.gov (United States)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2-D and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation, in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to second order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to second order, except at interfaces where different single grid systems meet; there the grid lines are differentiable only up to first order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coons interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
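    The transfinite-interpolation step at the core of the algebraic method can be sketched in a few lines: the interior grid is a bilinear blend of the four boundary curves, with the corner contributions subtracted once. The boundary curves below are toy choices; GRID2D/3D additionally layers stretching functions and spline-described boundaries on top of this.

```python
import math

# Minimal 2-D transfinite interpolation (Coons-type blend) over four
# boundary curves, each mapping a parameter in [0, 1] to an (x, y) point.
def tfi(bottom, top, left, right, ni, nj):
    grid = []
    for j in range(nj):
        eta = j / (nj - 1)
        row = []
        for i in range(ni):
            xi = i / (ni - 1)
            pt = []
            for k in range(2):  # x component, then y component
                u = ((1 - eta) * bottom(xi)[k] + eta * top(xi)[k]
                     + (1 - xi) * left(eta)[k] + xi * right(eta)[k]
                     - ((1 - xi) * (1 - eta) * bottom(0)[k]
                        + xi * (1 - eta) * bottom(1)[k]
                        + (1 - xi) * eta * top(0)[k]
                        + xi * eta * top(1)[k]))
                pt.append(u)
            row.append(tuple(pt))
        grid.append(row)
    return grid

bottom = lambda s: (s, 0.1 * math.sin(math.pi * s))  # curved lower edge
top    = lambda s: (s, 1.0)
left   = lambda t: (0.0, t)
right  = lambda t: (1.0, t)

g = tfi(bottom, top, left, right, 11, 11)
print(g[0][0], g[10][10])  # corners reproduced: (0.0, 0.0) (1.0, 1.0)
```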

  11. Advances in Time-Domain Electromagnetic Simulation Capabilities Through the Use of Overset Grids and Massively Parallel Computing

    Science.gov (United States)

    1997-03-01

    Douglas C. Blake, Captain, USAF; Department of Aeronautical and Astronautical Engineering. (Only title-page and table-of-contents fragments of this report were captured.)

  12. Use of personal computers for translation and publication of an anesthesia textbook.

    Science.gov (United States)

    Kayama, H; Iwase, Y; Kinefuchi, Y; Suwa, K

    1995-03-01

    We used personal computers extensively for translating and publishing in Japanese an anesthesia textbook originally written in English. The procedure included optical character recognition, scanning of figures, use of computer translation, use of electronic mail, and computer typesetting. While each of these has been done before, this is the first time they have been combined across the entire process for a medical textbook published in Japanese. The advantages of combining these technologies are good exchange of information among individual authors/translators, a rapid translation process, preliminary visualization of the final product, and overall high quality of the published book.

  13. An assessment of mercury in estuarine sediment and tissue in Southern New Jersey using public domain data

    Science.gov (United States)

    Ng, Kara; Szabo, Zoltan; Reilly, Pamela A.; Barringer, Julia; Smalling, Kelly L.

    2016-01-01

    Mercury (Hg) is considered a contaminant of global concern for coastal environments due to its toxicity, widespread occurrence in sediment, and bioaccumulation in tissue. Coastal New Jersey, USA, is characterized by shallow bays and wetlands that provide critical habitat for wildlife but share space with expanding urban landscapes. This study was designed as an assessment of the magnitude and distribution of Hg in coastal New Jersey sediments and critical species using publicly available data to highlight potential data gaps. Mercury concentrations in estuary sediments can exceed 2 μg/g and correlate with concentrations of other metals. Based on existing data, the concentrations of Hg in mussels in southern New Jersey are comparable to those observed in other urbanized Atlantic Coast estuaries. The lack of methylmercury data for sediments, other media, and tissue is a data gap that needs to be filled for a clearer understanding of the impacts of Hg inputs to the ecosystem.

  14. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principles, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time domain), geometric reasoning (space domain) and decision-making (interaction between the time and space domains). Some theoretical fundamentals of computational manufacturing are also discussed.

  15. Domains and domain loss

    DEFF Research Database (Denmark)

    Haberland, Hartmut

    2005-01-01

    The domain concept, originally suggested by Schmidt-Rohr in the 1930’s (as credited in Fishman’s writings in the 1970s), was an attempt to sort out different areas of language use in multilingual societies, which are relevant for language choice. In Fishman’s version, domains were considered...... not described in terms of domains, and recent research e.g. about the multilingual communities in the Danish-German border area seems to confirm this....

  16. The Latin American Giant Observatory: a successful collaboration in Latin America based on Cosmic Rays and computer science domains

    CERN Document Server

    Asorey, H; Núñez, L A; Rodríguez-Pascual, M; Montero, A J Rubio; Suarez-Durán, M; Torres-Niño, L A

    2016-01-01

    In this work the strategy of the Latin American Giant Observatory (LAGO) to build a Latin American collaboration is presented. Installing Cosmic Ray detectors settled all around the Continent, from Mexico to the Antarctica, this collaboration is forming a community that embraces both high-energy physicists and computer scientists. This is because the measured data must be analytically processed, and because a priori and a posteriori simulations representing the effects of the radiation must be performed. To perform these calculations, customized codes have been implemented by the collaboration. With regard to the huge amount of data emerging from this network of sensors and from the computational simulations performed on a diversity of computing architectures and e-infrastructures, an effort is being carried out to catalog and preserve the vast amount of data produced by the water-Cherenkov detector network and the complete LAGO simulation workflow that characterizes each site. M...

  17. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Background: Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years on the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results: We present GLEaMviz, a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side.
    Conclusions: The user-friendly graphical interface of the GLEaMviz tool, along with its high level
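    The compartmental dynamics that the GLEaM engine iterates (on a far richer worldwide metapopulation network) can be illustrated with a minimal single-population stochastic SIR step; all rates and sizes below are toy values, not GLEaMviz defaults.

```python
import random

# One stochastic SIR step: each susceptible is infected with probability
# beta*I/N, each infective recovers with probability gamma.
def sir_step(s, i, r, beta, gamma, n, rng):
    new_inf = sum(rng.random() < beta * i / n for _ in range(s))
    new_rec = sum(rng.random() < gamma for _ in range(i))
    return s - new_inf, i + new_inf - new_rec, r + new_rec

rng = random.Random(42)
s, i, r = 990, 10, 0
for _ in range(100):
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1, n=1000, rng=rng)

print(s + i + r)  # population is conserved: 1000
```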

  18. Automatic detection of lung nodules in computed tomography images: training and validation of algorithms using public research databases

    Science.gov (United States)

    Camarlinghi, Niccolò

    2013-09-01

    Lung cancer is one of the main public health issues in developed countries. Lung cancer typically manifests itself as non-calcified pulmonary nodules that can be detected by reading lung Computed Tomography (CT) images. To assist radiologists in reading images, researchers started, a decade ago, the development of Computer Aided Detection (CAD) methods capable of detecting lung nodules. In this work, a CAD composed of two subprocedures is presented: one devoted to the identification of parenchymal nodules, and one devoted to the identification of nodules attached to the pleura surface. Both are upgrades of two methods previously presented as the Voxel-Based Neural Approach (VBNA) CAD. The novelty of this paper consists in the massive training using the public research Lung Image Database Consortium (LIDC) database and in the implementation of new features for classification with respect to the original VBNA method. Finally, the proposed CAD is blindly validated on the ANODE09 dataset. The result of the validation is a score of 0.393, which corresponds to the average sensitivity of the CAD computed at seven predefined false positive rates: 1/8, 1/4, 1/2, 1, 2, 4, and 8 FP/CT.
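    The ANODE09-style scoring rule described above is simply the mean sensitivity sampled at seven operating points of the FROC curve. In the sketch below, only the averaging rule comes from the text; the individual sensitivity values are invented for illustration.

```python
# Score = average sensitivity at seven predefined false-positive rates.
FP_RATES = [1/8, 1/4, 1/2, 1, 2, 4, 8]  # FP per CT scan

def anode09_score(sensitivities):
    assert len(sensitivities) == len(FP_RATES)
    return sum(sensitivities) / len(sensitivities)

# Hypothetical per-operating-point sensitivities of a CAD system:
example = [0.20, 0.28, 0.35, 0.41, 0.46, 0.50, 0.55]
print(round(anode09_score(example), 3))  # 0.393
```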

  19. Computational modeling of the Fc αRI receptor binding in the Fc α domain of the human antibody IgA: Normal Modes Analysis (NMA) study

    Science.gov (United States)

    Jayasinghe, Manori; Posgai, Monica; Tonddast-Navaei, Sam; Ibrahim, George; Stan, George; Herr, Andrew; George Stan Group Collaboration; Herr's Group Team

    2014-03-01

    Fc αRI receptor binding in the Fc α domain of the antibody IgA triggers immune effector responses such as phagocytosis and antibody-dependent cell-mediated cytotoxicity in eukaryotic cells. Fc α is a dimer of heavy chains of the IgA antibody, and each Fc α heavy chain, which consists of two immunoglobulin constant domains, CH2 and CH3, can bind one Fc αRI molecule at the CH2-CH3 interface, forming a 2:1 stoichiometry. Experimental evidence confirmed that Fc αRI binding to the Fc α CH2-CH3 junction altered the kinetics of HAA lectin binding at the distant IgA1 hinge. Our focus in this research was to understand the conformational changes and the network of residues that coordinate the receptor binding dynamics of the Fc α dimer complex. Structure-based elastic network modeling was used to compute normal modes of distinct Fc α configurations. Asymmetric and un-liganded Fc α configurations were obtained from the high-resolution crystal structure of the Fc α-Fc αRI 2:1 symmetric complex of PDB ID 1OW0. Our findings confirmed that Fc αRI binding, either in asymmetric or symmetric complex with Fc α, propagated long-range conformational changes across the Fc domains, potentially also impacting the distant IgA1 hinge.

  20. A locally parametrized reduced order model for the linear frequency domain approach to time-accurate computational fluid dynamics

    DEFF Research Database (Denmark)

    Zimmermann, Ralf

    2014-01-01

    ) in an offline stage. The claimed trajectory is obtained locally by interpolating the given local subspaces considered as sample points in the Grassmann manifold. It is shown that the manifold interpolation technique is subject to certain restrictions. Moreover, it turns out that the application of computing...... under a sinusoidal pitching motion....

  1. Enabling Water Quality Management Decision Support and Public Outreach Using Cloud-Computing Services

    Science.gov (United States)

    Sun, A. Y.; Scanlon, B. R.; Uhlman, K.

    2013-12-01

    Watershed management is a participatory process that requires collaboration among multiple groups of people. Environmental decision support systems (EDSS) have long been used to support such co-management and co-learning processes in watershed management. However, implementing and maintaining EDSS in-house can be a significant burden to many water agencies because of budget, technical, and policy constraints. Drawing on experience from several web-GIS environmental management projects in Texas, we showcase how cloud-computing services can help shift the design and hosting of EDSS from traditional client-server platforms to simple clients of cloud-computing services.

  2. Effectiveness of the Computer and Internet Literacy Project in Public High Schools of Tarlac Province, Philippines

    Science.gov (United States)

    Lorenzo, Arnold R.

    2016-01-01

    Evaluation is important to gauge the strengths, weaknesses and effectiveness of any activity. This study evaluated the iSchools Project implemented in the Public High Schools of Tarlac Province, Philippines by the Commission on Information and Communications Technology (CICT) in partnership with the selected State Universities and Colleges. Using…

  3. An Overview of Public Access Computer Software Management Tools for Libraries

    Science.gov (United States)

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  4. Factors Influencing the Adoption of and Business Case for Cloud Computing in the Public Sector

    NARCIS (Netherlands)

    Kuiper, E.; Van Dam, F.; Reiter, A.; Janssen, M.F.W.H.A.

    2014-01-01

    Cloud adoption in the public sector is taking off slowly, which is perceived as a problem. Models of factors influencing cloud adoption are derived for better understanding using literature and results obtained via desk research and surveys by the Cloud for Europe project. We conclude that several f

  6. Structural models of zebrafish (Danio rerio) NOD1 and NOD2 NACHT domains suggest differential ATP binding orientations: insights from computational modeling, docking and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Jitendra Maharana

    Nucleotide-binding oligomerization domain-containing protein 1 (NOD1) and NOD2 are cytosolic pattern recognition receptors playing pivotal roles in innate immune signaling. NOD1 and NOD2 recognize the bacterial peptidoglycan derivatives iE-DAP and MDP, respectively, and undergo conformational alteration and ATP-dependent self-oligomerization of the NACHT domain, followed by downstream signaling. The lack of structural data for the NACHT domain limits our understanding of the NOD-mediated signaling mechanism. Here, we predicted the structure of the NACHT domain of both NOD1 and NOD2 from the model organism zebrafish (Danio rerio) using computational methods. Our study highlighted the differential ATP binding modes in NOD1 and NOD2. In NOD1, the γ-phosphate of ATP faced toward the central nucleotide binding cavity, as in NLRC4, whereas in NOD2 the cavity was occupied by the adenine moiety. The conserved lysine at Walker A formed hydrogen bonds (H-bonds), and the aspartic acid at Walker B formed an electrostatic interaction, with ATP. At Sensor 1, Arg328 of NOD1 exhibited an H-bond with ATP, whereas the corresponding Arg404 of NOD2 did not. The proline of the GxP motif (Pro386 of NOD1 and Pro464 of NOD2) interacted with the adenine moiety, and His511 at Sensor 2 of NOD1 interacted with the γ-phosphate group of ATP. In contrast, His579 of NOD2 interacted with the adenine moiety, which adopts a relatively inverted orientation. Our findings are well supported by the molecular interaction of ATP with NLRC4, and are consistent with mutagenesis data reported for human, which indicates an evolutionarily shared NOD signaling mechanism. Together, this study provides novel insights into the ATP binding mechanism and highlights the differential ATP binding modes in zebrafish NOD1 and NOD2.

  7. Computer-Based Video Instruction to Teach Students with Intellectual Disabilities to Use Public Bus Transportation

    Science.gov (United States)

    Mechling, Linda; O'Brien, Eileen

    2010-01-01

    This study investigated the effectiveness of computer-based video instruction (CBVI) to teach three young adults with moderate intellectual disabilities to push a "request to stop bus signal" and exit a city bus in response to target landmarks. A multiple probe design across three students and one bus route was used to evaluate effectiveness of…

  8. Computers in the Curriculum of Secondary Schools. Practitioner MiniPaper 8. SCR Publication 106.

    Science.gov (United States)

    Morrison, Arnold

    The two studies reported in this document were commissioned by the Scottish Education Department. The first is a review of research on the effectiveness of computers as resources for learning and teaching. It reports the general findings from a large number of studies concerned with the achievement and attitudes of students and then considers some…

  9. Implicit upwind schemes for computational fluid dynamics. Solution by domain decomposition; Etude des schemas decentres implicites pour le calcul numerique en mecanique des fluides. Resolution par decomposition de domaine

    Energy Technology Data Exchange (ETDEWEB)

    Clerc, S

    1998-07-01

    In this work, the numerical simulation of fluid dynamics equations is addressed. Implicit upwind schemes of finite volume type are used for this purpose. The first part of the dissertation deals with the improvement of the computational precision in unfavourable situations. A non-conservative treatment of some source terms is studied in order to correct some shortcomings of the usual operator-splitting method. Besides, finite volume schemes based on Godunov's approach are unsuited to computing low Mach number flows. A modification of the upwinding by preconditioning is introduced to correct this defect. The second part deals with the solution of steady-state problems arising from an implicit discretization of the equations. A well-posed linearized boundary value problem is formulated. We prove the convergence of a domain decomposition algorithm of Schwarz type for this problem. This algorithm is implemented either directly, or in a Schur complement framework. Finally, another approach is proposed, which consists in decomposing the non-linear steady-state problem. (author)
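    The alternating Schwarz idea the convergence proof concerns can be sketched on a toy 1-D Laplace problem: each overlapping subdomain is solved exactly with Dirichlet data taken from the latest iterate of its neighbor, and the iterates contract toward the global solution. The mesh, overlap, and model problem are illustrative choices, not taken from the dissertation.

```python
# Alternating (multiplicative) Schwarz for u'' = 0 on [0, 1],
# u(0) = 0, u(1) = 1, with two overlapping subdomains. The exact
# subdomain solve for this problem is linear interpolation between
# the subdomain's current boundary values.
N = 21                      # grid points on [0, 1]
u = [0.0] * N
u[-1] = 1.0
left = (0, 13)              # index ranges; overlap on points 8..13
right = (8, 20)

def solve_subdomain(u, a, b):
    for k in range(a + 1, b):
        u[k] = u[a] + (u[b] - u[a]) * (k - a) / (b - a)

for _ in range(20):         # Schwarz sweeps
    solve_subdomain(u, *left)
    solve_subdomain(u, *right)

exact = [k / (N - 1) for k in range(N)]
err = max(abs(x - y) for x, y in zip(u, exact))
print(err < 1e-6)  # True: iterates converge to the global solution
```

The geometric convergence rate depends on the overlap width: a wider overlap gives a smaller contraction factor per sweep.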

  10. Current limitations of SNP data from the public domain for studies of complex disorders: a test for ten candidate genes for obesity and osteoporosis

    Directory of Open Access Journals (Sweden)

    Xiao Peng

    2004-02-01

    Background: Public SNP databases are frequently used to choose SNPs for candidate genes in association and linkage studies of complex disorders. However, their utility for such studies of diseases with an ethnicity-dependent background has never been evaluated. Results: To estimate the accuracy and completeness of public SNP databases, we analyzed the allele frequencies of 41 SNPs in 10 candidate genes for obesity and/or osteoporosis in a large American-Caucasian sample (1,873 individuals from 405 nuclear families) by PCR-Invader assay. We compared our results with those from the databases and other published studies. Of the 41 SNPs, 8 were monomorphic in our sample. Twelve were reported for the first time for Caucasians, and the other 29 SNPs in our sample essentially confirmed the respective allele frequencies for Caucasians in the databases and previous studies. The comparison of our data with other ethnic groups showed significant differentiation between the three major world ethnic groups at some SNPs (Caucasians and Africans differed at 3 of the 18 shared SNPs, and Caucasians and Asians differed at 13 of the 22 shared SNPs). This genetic differentiation may have an important implication for studying the well-known ethnic differences in the prevalence of obesity and osteoporosis, and complex disorders in general. Conclusion: A comparative analysis of the SNP data of the candidate genes obtained in the present study, as well as those retrieved from the public domain, suggests that the databases may currently have serious limitations for studying complex disorders with an ethnicity-dependent background, due to the incomplete and uneven representation of the candidate SNPs in the databases for the major ethnic groups. This conclusion attests to the imperative necessity of large-scale and accurate characterization of these SNPs in different ethnic groups.
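    The between-group allele-frequency differentiation reported above is typically assessed with a contingency-table test. The sketch below computes a 2x2 chi-square statistic on allele counts from two populations; the counts are invented for illustration, and a real analysis would also correct for multiple testing across SNPs.

```python
# 2x2 chi-square statistic for allele counts in two populations.
def chi_square_2x2(a, b, c, d):
    """Counts: pop1 carries (a, b) copies of alleles A, B; pop2 (c, d)."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

stat = chi_square_2x2(120, 80, 90, 110)   # hypothetical allele counts
print(stat > 3.84)  # True -> frequencies differ at the 5% level (1 d.f.)
```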

  11. Establishing the Public Sphere and Abolishing the Private Domain:The Rise of a Doctrine and Its Social Significance in the Spring and Autumn Period

    Institute of Scientific and Technical Information of China (English)

    Liu Zehua

    2006-01-01

    The dominant views regarding the concepts of "the public" (gong) and "the private" (si) took shape in the Spring and Autumn period and matured in the succeeding years of the Warring States period. This paper traces both the growth of the vocabulary containing gong and si and the development of philosophical views on issues centering on the relation between the individual and the larger social, communal, and political body of which that individual is a member; it also touches on issues related to the proper handling of public affairs and the relations among state, sovereign, and individual. The era is often characterized as "the Contention of the Hundred Schools of Thought," yet it ended with one view universally accepted by thinkers of diverse persuasions, namely, that si is the source of all social evil and should therefore be condemned. This is the doctrine known as ligong miesi (abolishing si so that gong may be established), which contributed to the orthodoxy of that era and of the millennium to come. By extolling gong and condemning si, it painted a portrait of the pair as two irreconcilable norms or forces in social and political life; it provided a justification for the then-emerging new social arrangement and ways of distributing power and resources; and it also led to acute conflicts between sovereign and state, ruled and ruler, state and subject, and the public sphere and the private domain.

  12. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video, and any other media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, live collaboration on notebooks via Google Docs, and better support for languages other than Python.
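    The notebook document format mentioned above is, concretely, a JSON file whose cells mix code and narrative. The sketch below hand-builds a minimal nbformat-4 skeleton with the standard library; real notebooks are produced by the Jupyter tooling and carry richer metadata.

```python
import json

# Minimal nbformat-4 notebook: top-level version keys plus a cell list.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {"kernelspec": {"name": "python3", "display_name": "Python 3"}},
    "cells": [
        {"cell_type": "markdown", "metadata": {},
         "source": ["# A narrative heading\n", "Some explanatory text.\n"]},
        {"cell_type": "code", "metadata": {}, "execution_count": None,
         "outputs": [],
         "source": ["print(2 + 2)\n"]},
    ],
}

text = json.dumps(notebook, indent=1)      # what lands on disk as .ipynb
roundtrip = json.loads(text)
print([c["cell_type"] for c in roundtrip["cells"]])  # ['markdown', 'code']
```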

  13. Application of multi-thread computing and domain decomposition to the 3-D neutronics Fem code Cronos

    Energy Technology Data Exchange (ETDEWEB)

    Ragusa, J.C. [CEA Saclay, Direction de l' Energie Nucleaire, Service d' Etudes des Reacteurs et de Modelisations Avancees (DEN/SERMA), 91 - Gif sur Yvette (France)

    2003-07-01

    The purpose of this paper is to present the parallelization of the flux solver and the isotopic depletion module of the code, either using Message Passing Interface (MPI) or OpenMP. Thread parallelism using OpenMP was used to parallelize the mixed dual FEM (finite element method) flux solver MINOS. Investigations regarding the opportunity of mixing parallelism paradigms will be discussed. The isotopic depletion module was parallelized using domain decomposition and MPI. An attempt at using OpenMP was unsuccessful and will be explained. This paper is organized as follows: the first section recalls the different types of parallelism. The mixed dual flux solver and its parallelization are then presented. In the third section, we describe the isotopic depletion solver and its parallelization; and finally conclude with some future perspectives. Parallel applications are mandatory for fine mesh 3-dimensional transport and simplified transport multigroup calculations. The MINOS solver of the FEM neutronics code CRONOS2 was parallelized using the directive based standard OpenMP. An efficiency of 80% (resp. 60%) was achieved with 2 (resp. 4) threads. Parallelization of the isotopic depletion solver was obtained using domain decomposition principles and MPI. Efficiencies greater than 90% were reached. These parallel implementations were tested on a shared memory symmetric multiprocessor (SMP) cluster machine. The OpenMP implementation in the solver MINOS is only the first step towards fully using the SMPs cluster potential with a mixed mode parallelism. Mixed mode parallelism can be achieved by combining message passing interface between clusters with OpenMP implicit parallelism within a cluster.
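    The parallel-efficiency figures quoted above follow the standard definitions: speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p, so 80% efficiency on 2 threads means the solver ran 1.6x faster than the serial version. The absolute timings below are made up for illustration.

```python
# Standard speedup and parallel-efficiency arithmetic.
def speedup(t1, tp):
    return t1 / tp

def efficiency(t1, tp, p):
    return speedup(t1, tp) / p

t_serial = 100.0                        # hypothetical serial wall time (s)
print(efficiency(t_serial, 62.5, 2))    # 0.8  -> 80% on 2 threads
print(efficiency(t_serial, 41.7, 4))    # ≈0.6 -> 60% on 4 threads
```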

  14. Computational study of three dimensional viscous flow through a turbine cascade using a multi-domain spectral technique

    Science.gov (United States)

    Renaud, Earl W.; Tan, Choon S.

    1991-01-01

    The three-dimensional viscous flow through a planar turbine cascade is numerically simulated by direct solution of the incompressible Navier-Stokes equations. Flow dependence in the spanwise direction is represented by direct expansion in Chebyshev polynomials, while the discretization on planes parallel to the endwalls is accomplished using the spectral element method. Elemental mapping from the physical to the computational space uses an algebraic mapping technique. A fractional time stepping method that consists of an explicit nonlinear convective step, an implicit pressure correction step, and an implicit viscous step is used to advance the Navier-Stokes equations forward in time. Results computed at moderate Reynolds numbers show a three-dimensional endwall flow separation, a midspan separation of the blade suction surface boundary layer, and other three-dimensional features such as the presence of a saddle point flow in the endwall region. In addition, the computed skin friction lines are shown to be orthogonal to the surface vorticity lines, demonstrating the accuracy achievable in the present method.
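    The Chebyshev machinery behind the spanwise expansion can be sketched briefly: T_n satisfies the three-term recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x) and the identity T_n(cos t) = cos(n t), which is what makes fast, cosine-transform-based spectral evaluation possible. The check values below are arbitrary.

```python
import math

# Evaluate the Chebyshev polynomial T_n(x) by the three-term recurrence.
def chebyshev_T(n, x):
    t_prev, t = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2 * x * t - t_prev
    return t

t = 0.7
for n in range(6):
    # T_n(cos t) == cos(n t), the basis of cosine-transform evaluation
    assert abs(chebyshev_T(n, math.cos(t)) - math.cos(n * t)) < 1e-12
print("recurrence matches the cos(n*t) identity")
```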

  15. Reflections on Teachers' Cooperative Teaching in the Public Computer Course

    Institute of Scientific and Technical Information of China (English)

    李富芸; 吴淑雷

    2014-01-01

    Based on the practice of two teachers co-teaching a public computer course, this paper reflects on the advantages and disadvantages of cooperative teaching in public computer courses.

  16. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process.

    Science.gov (United States)

    Bruni, Roberto; Costantino, Angela; Tritarelli, Elena; Marcantonio, Cinzia; Ciccozzi, Massimo; Rapicetta, Maria; El Sawaf, Gamal; Giuliani, Alessandro; Ciccaglione, Anna Rita

    2009-07-29

    The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure, and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by the sequence alignment method applied to the FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with respect to a control HCV sequence from the Core protein, thus giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows for a direct comparison of domains for the presence of common hydrophobicity patterns, upon which a physical interaction would be based. RQA greatly strengthened the reliability of the hypothesis by scoring many cross-recurrences between the FP and CT peptide hydrophobicity patterns, largely outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT region of E1, reducing the fusion process in
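    The recurrence idea behind RQA as used above can be sketched minimally: two hydrophobicity profiles are compared point by point, positions whose values fall within a tolerance are marked recurrent, and RQA quantifies the density and structure of those marks. The Kyte-Doolittle-style scale and the two toy peptides below are illustrative, not the E1 sequences analyzed in the paper.

```python
# Cross-recurrence rate between two hydrophobicity profiles.
HYDRO = {"A": 1.8, "I": 4.5, "L": 3.8, "G": -0.4, "F": 2.8,
         "S": -0.8, "K": -3.9, "V": 4.2}

def cross_recurrence(seq1, seq2, eps=1.0):
    p1 = [HYDRO[a] for a in seq1]
    p2 = [HYDRO[a] for a in seq2]
    # recurrence plot: mark (i, j) when the two values are within eps
    plot = [[abs(x - y) <= eps for y in p2] for x in p1]
    hits = sum(sum(row) for row in plot)
    return hits / (len(p1) * len(p2))

rate = cross_recurrence("AILGF", "VILKS")
print(0.0 <= rate <= 1.0)  # True; a higher rate means more shared patterning
```

Full RQA goes further, scoring diagonal line structures in the plot (determinism), which is what distinguishes shared patterning from isolated coincidences.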

  17. Reference computations of public dose and cancer risk from airborne releases of plutonium. Nuclear safety technical report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, V.L.

    1993-12-23

    This report presents results of computations of doses and the associated health risks of postulated accidental atmospheric releases from the Rocky Flats Plant (RFP) of one gram of weapons-grade plutonium in a form that is respirable. These computations are intended to be reference computations that can be used to evaluate a variety of accident scenarios by scaling the dose and health risk results presented here according to the amount of plutonium postulated to be released, instead of repeating the computations for each scenario. The MACCS2 code has been used as the basis of these computations. The basis and capabilities of MACCS2 are summarized, the parameters used in the evaluations are discussed, and results are presented for the doses and health risks to the public, both the Maximum Offsite Individual (a maximally exposed individual at or beyond the plant boundaries) and the population within 50 miles of RFP. A number of different weather scenarios are evaluated, including constant weather conditions and observed weather for 1990, 1991, and 1992. The isotopic mix of weapons-grade plutonium will change as it ages, the ²⁴¹Pu decaying into ²⁴¹Am. The ²⁴¹Am reaches a peak concentration after about 72 years. The doses to the bone surface, liver, and whole body will increase slightly but the dose to the lungs will decrease slightly. The overall cancer risk will show almost no change over this period. This change in cancer risk is much smaller than the year-to-year variations in cancer risk due to weather. Finally, χ/Q values are also presented for other applications, such as for hazardous chemical releases. These include the χ/Q values for the MOI, for a collocated worker at 100 meters downwind of an accident site, and the χ/Q value integrated over the population out to 50 miles.
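The report's reference approach (one gram of respirable material as the unit release) amounts to simple linear proportionality. A minimal sketch, with placeholder reference values that are NOT the figures computed by MACCS2:

```python
# Linear scaling of reference consequences computed for a 1 g release.
# The reference dose and risk below are illustrative placeholders only.
REFERENCE_RELEASE_G = 1.0

def scale_consequences(release_g, ref_dose_rem, ref_cancer_risk):
    """Scale reference dose (rem) and cancer risk to an arbitrary
    postulated release quantity, assuming linear proportionality."""
    factor = release_g / REFERENCE_RELEASE_G
    return ref_dose_rem * factor, ref_cancer_risk * factor

# A postulated 0.25 g release, scaled from hypothetical reference values
dose, risk = scale_consequences(0.25, ref_dose_rem=2.0e-1, ref_cancer_risk=1.0e-4)
print(dose, risk)  # → 0.05 2.5e-05
```

This is exactly why one set of reference computations suffices for many scenarios: only the multiplier changes.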

  18. Computer program to plot isotherms in bodies of water. Environmental Sciences Division publication No. 1199

    Energy Technology Data Exchange (ETDEWEB)

    DeAngelis, D.L.

    1978-06-01

    For purposes of graphic display it is convenient to represent temperature versus depth data in bodies of water in the form of isotherms (lines of equal temperature). Because it can be tedious to draw such lines by hand from raw data, a computer code has been devised to plot these lines automatically. The procedure assumes that the temperature can be linearly interpolated between the points at which measurements are taken. Details of the code are explained by means of examples. With minor changes, the program can be used to plot isoclines of other environmental parameters.
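The linear-interpolation assumption the code relies on can be illustrated directly: for each pair of adjacent measurements, an isotherm crossing is located by inverse interpolation. The depth-temperature profile below is hypothetical:

```python
def isotherm_depths(depths, temps, iso_temp):
    """Return the depths where the temperature profile crosses iso_temp,
    assuming temperature varies linearly between measurement points
    (the same assumption the plotting code makes)."""
    crossings = []
    for (d0, t0), (d1, t1) in zip(zip(depths, temps),
                                  zip(depths[1:], temps[1:])):
        lo, hi = sorted((t0, t1))
        if lo <= iso_temp <= hi and t0 != t1:
            frac = (iso_temp - t0) / (t1 - t0)  # inverse interpolation
            crossings.append(d0 + frac * (d1 - d0))
    return crossings

# Hypothetical profile: temperature (deg C) at increasing depth (m)
depths = [0, 5, 10, 15, 20]
temps = [22.0, 20.0, 14.0, 9.0, 8.0]
print(isotherm_depths(depths, temps, 15.0))  # depth(s) where T = 15 deg C
```

Repeating this for each temperature of interest and connecting the crossings across measurement stations yields the isotherm lines.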

  20. Selected Publications in Image Understanding and Computer Vision from 1974 to 1983

    Science.gov (United States)

    1985-04-18

    graylevel image processing, in [261], 135-147. 327. C. Guerra, Reflections on local computations, in [261], 221-229. D.3. Software, etc. 368. D... system for reconstruction of mechanical object from projections, in [4], 491-496. 843. B. Cernuschi-Frias, D. B. Cooper, and R. M. Bolle, Estimation of... K. R. Sloan, Jr., Analysis of "dot product" shape descriptions, T-PAMI 4, 1982, 87-90. 1012. C. Guerra and G. G. Pieroni, A graph-theoretic method

  1. Virtual Space Exploration: Let's Use Web-Based Computer Game Technology to Boost IYA 2009 Public Interest

    Science.gov (United States)

    Hussey, K.; Doronila, P.; Kulikov, A.; Lane, K.; Upchurch, P.; Howard, J.; Harvey, S.; Woodmansee, L.

    2008-09-01

    With the recent releases of both Google's "Sky" and Microsoft's "WorldWide Telescope" and the large and increasing popularity of video games, the time is now for using these tools, and those crafted at NASA's Jet Propulsion Laboratory, to engage the public in astronomy like never before. This presentation will use the "Cassini at Saturn Interactive Explorer" (CASSIE) to demonstrate the power of web-based video-game engine technology in providing the public a "first-person" look at space exploration. The concept of virtual space exploration is to allow the public to "see" objects in space as if they were either riding aboard or "flying" next to an ESA/NASA spacecraft. Using this technology, people are able to immediately "look" in any direction from their virtual location in space and "zoom in" at will. Users can position themselves near Saturn's moons and observe the Cassini spacecraft's "encounters" as they happened. Whenever real data for their "view" exists, it is incorporated into the scene. Where data is missing, a high-fidelity simulation of the view is generated to fill in the scene. The observer can also change the time of observation into the past or future. Our approach is to utilize and extend the Unity 3d game development tool, currently in use by the computer gaming industry, along with JPL mission-specific telemetry and instrument data to build our virtual explorer. The potential of applying game technology to the development of educational curricula and to public engagement is huge. We believe this technology can revolutionize the way the general public and the planetary science community view ESA/NASA missions and can provide an educational context that is attractive to the younger generation. This technology is currently under development and application at JPL to assist our missions in viewing their data, communicating with the public and visualizing future mission plans.
Real-time demonstrations of CASSIE and other applications in development

  2. Computing a numerical solution of two dimensional non-linear Schrödinger equation on complexly shaped domains by RBF based differential quadrature method

    Science.gov (United States)

    Golbabai, Ahmad; Nikpour, Ahmad

    2016-10-01

    In this paper, two-dimensional Schrödinger equations are solved by the differential quadrature method. The key point in this method is the determination of the weight coefficients for the approximation of spatial derivatives. The multiquadric (MQ) radial basis function is applied as the test function to compute these weight coefficients. Unlike traditional DQ methods, which were originally defined on meshes of node points, the RBF-DQ method requires no mesh-connectivity information and allows straightforward implementation on unstructured nodes. Moreover, the calculation of coefficients using the MQ function includes a shape parameter c. A new variable shape parameter is introduced and its effect on the accuracy and stability of the method is studied. We perform an analysis of the dispersion error, and different internal parameters of the algorithm are studied in order to examine the behavior of this error. Numerical examples show that the MQ-DQ method can efficiently approximate problems in complexly shaped domains.
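The determination of DQ weight coefficients from the MQ basis can be sketched in one dimension: the weights are required to be exact for every multiquadric basis function, which yields a small linear system per node. The node positions and shape parameter c below are illustrative, not values from the paper:

```python
import numpy as np

def mq_dq_weights(nodes, i, c=0.5):
    """Weights w such that f'(x_i) ~ sum_j w_j f(x_j), obtained by
    forcing exactness on the multiquadric basis functions
    phi_k(x) = sqrt((x - x_k)^2 + c^2), where c is the shape parameter."""
    x = np.asarray(nodes, dtype=float)
    xi = x[i]
    # A[j, k] = phi_k(x_j);  b[k] = phi_k'(x_i)
    A = np.sqrt((x[:, None] - x[None, :]) ** 2 + c ** 2)
    b = (xi - x) / np.sqrt((xi - x) ** 2 + c ** 2)
    return np.linalg.solve(A.T, b)

nodes = [0.0, 0.3, 0.55, 0.8, 1.0]   # unstructured 1-D nodes (illustrative)
w = mq_dq_weights(nodes, i=2)
# sanity check: differentiate f(x) = x^2 at x = 0.55 (exact derivative 1.1)
f = np.array(nodes) ** 2
print(w @ f)
```

In two dimensions the same construction is applied per node over a local support set, which is what lets the method work on scattered nodes in complexly shaped domains.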

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. ACUTRI a computer code for assessing doses to the general public due to acute tritium releases

    CERN Document Server

    Yokoyama, S; Noguchi, H; Ryufuku, S; Sasaki, T

    2002-01-01

    Tritium, which is used as a fuel of a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in a discussion of licensing of a fusion experimental reactor and an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and ICRP dose models. In this calculation it is also possible to perform a statistical analysis of the meteorology in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: i...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  8. BindingDB in 2015: A public database for medicinal chemistry, computational chemistry and systems pharmacology.

    Science.gov (United States)

    Gilson, Michael K; Liu, Tiqing; Baitaluk, Michael; Nicola, George; Hwang, Linda; Chong, Jenny

    2016-01-04

    BindingDB, www.bindingdb.org, is a publicly accessible database of experimental protein-small molecule interaction data. Its collection of over a million data entries derives primarily from scientific articles and, increasingly, US patents. BindingDB provides many ways to browse and search for data of interest, including an advanced search tool, which can cross searches of multiple query types, including text, chemical structure, protein sequence and numerical affinities. The PDB and PubMed provide links to data in BindingDB, and vice versa; and BindingDB provides links to pathway information, the ZINC catalog of available compounds, and other resources. The BindingDB website offers specialized tools that take advantage of its large data collection, including ones to generate hypotheses for the protein targets bound by a bioactive compound, and for the compounds bound by a new protein of known sequence; and virtual compound screening by maximal chemical similarity, binary kernel discrimination, and support vector machine methods. Specialized data sets are also available, such as binding data for hundreds of congeneric series of ligands, drawn from BindingDB and organized for use in validating drug design methods. BindingDB offers several forms of programmatic access, and comes with extensive background material and documentation. Here, we provide the first update of BindingDB since 2007, focusing on new and unique features and highlighting directions of importance to the field as a whole.

  9. ACUTRI: a computer code for assessing doses to the general public due to acute tritium releases

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Sumi; Noguchi, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ryufuku, Susumu; Sasaki, Toshihisa; Kurosawa, Naohiro [Visible Information Center, Inc., Tokai, Ibaraki (Japan)

    2002-11-01

    Tritium, which is used as a fuel of a D-T burning fusion reactor, is the most important radionuclide for the safety assessment of a nuclear fusion experimental reactor such as ITER. Thus, a computer code, ACUTRI, which calculates the radiological impact of tritium released accidentally to the atmosphere, has been developed, aiming to be of use in a discussion of licensing of a fusion experimental reactor and an environmental safety evaluation method in Japan. ACUTRI calculates an individual tritium dose based on transfer models specific to tritium in the environment and ICRP dose models. In this calculation it is also possible to perform a statistical analysis of the meteorology in the same way as a conventional dose assessment method according to the meteorological guide of the Nuclear Safety Commission of Japan. A Gaussian plume model is used for calculating the atmospheric dispersion of tritium gas (HT) and/or tritiated water (HTO). The environmental pathway model in ACUTRI considers the following internal exposures: inhalation from a primary plume (HT and/or HTO) released from the facilities and inhalation from a secondary plume (HTO) reemitted from the ground following deposition of HT and HTO. This report describes an outline of the ACUTRI code, a user guide and the results of test calculation. (author)
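The Gaussian plume dispersion step can be sketched as the standard ground-reflected dilution factor χ/Q; the dispersion coefficients and wind speed below are illustrative and are not values from ACUTRI:

```python
import math

def chi_over_q(sigma_y, sigma_z, u, y=0.0, z=0.0, H=0.0):
    """Ground-reflected Gaussian plume dilution factor chi/Q (s/m^3).

    sigma_y, sigma_z : dispersion coefficients (m) at the receptor distance
    u : wind speed (m/s);  y : crosswind offset (m)
    z : receptor height (m);  H : effective release height (m)
    """
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))
    return lateral * vertical / (2 * math.pi * sigma_y * sigma_z * u)

# Illustrative values: ground-level release, receptor on the plume centerline
print(chi_over_q(sigma_y=35.0, sigma_z=18.0, u=2.0))
```

Multiplying χ/Q by the release rate and a dose conversion factor then gives the inhalation dose, which is the structure a code of this kind builds on.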

  10. Computer Domain Term Automatic Extraction and Hierarchical Structure Building

    Institute of Scientific and Technical Information of China (English)

    林源; 陈志泊; 孙俏

    2011-01-01

    This paper presents an automatic extraction method for computer domain terms based on rules and statistics. It uses computer book titles from the Amazon.com website as its corpus; the data are preprocessed by word segmentation and by filtering out stop words and special characters. Terms are extracted by a set of rules combined with frequency statistics and inserted into a word tree built from ODP to form the hierarchical structure. Experimental results show high precision and recall of the automatically extracted terms compared with manually tagged terms.
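The statistical half of the rules-and-statistics pipeline can be sketched as frequency counting over tokenized titles with stop-word filtering; the titles, stop list, and threshold below are hypothetical, and the rule-based component is omitted:

```python
import re
from collections import Counter

# Hypothetical stop list; a real pipeline would use a fuller one.
STOP_WORDS = {"the", "a", "an", "and", "of", "to", "for", "in", "with", "art"}

def extract_terms(titles, min_freq=2):
    """Candidate domain terms: non-stop-word tokens whose frequency
    across titles reaches min_freq, most frequent first. A simplified
    sketch of the statistics step described in the abstract."""
    counts = Counter()
    for title in titles:
        for tok in re.findall(r"[a-z+#]+", title.lower()):
            if tok not in STOP_WORDS:
                counts[tok] += 1
    return [t for t, n in counts.most_common() if n >= min_freq]

titles = [   # hypothetical Amazon computer-book titles
    "The C Programming Language",
    "Programming in Python",
    "Python for Data Analysis",
    "The Art of Computer Programming",
]
print(extract_terms(titles))  # → ['programming', 'python']
```

Surviving terms would then be matched against the ODP-derived word tree to place them in the hierarchy.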

  11. The Fusion of the Public Domain and Private Sphere in Network Micro Blog

    Institute of Scientific and Technical Information of China (English)

    石良

    2012-01-01

    Taking the micro blog as its starting point, and drawing on sociological theories such as discursive power, social roles, and opinion leaders, this paper clarifies, against the background of the public domain and the private sphere, the tendency of the two to fuse on the network: the public domain becomes personalized while the private sphere becomes public. Grounded in reality, the paper also analyzes the practical effects that the fusion of the public and private spheres brings to society.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  13. The Perceptions of Globalization at a Public Research University Computer Science Graduate Department

    Science.gov (United States)

    Nielsen, Selin Yildiz

    Based on a qualitative methodological approach, this study focuses on the understanding of a phenomenon called globalization in a research university computer science department. The study looks into the participants' perspectives on the department, its dynamics, culture and academic environment as related to globalization. The economic, political, academic and social/cultural aspects of the department are taken into consideration in investigating the influences of globalization. Three questions guide this inquiry: 1) How is the notion of globalization interpreted in this department? 2) How does the perception of globalization influence the department in terms of finances, academics, policies and social life? And 3) How do these perceptions influence the selection of students? Globalization and the neo-institutional view of legitimacy are used as theoretical lenses to conceptualize responses to these questions. The data include interviews, field notes, and official and non-official documents. Interpretations of these data are compared to findings from prior research on the impact of globalization in order to clarify and validate findings. Findings show that there is disagreement in how the notion of globalization is interpreted between the doctoral students and the faculty in the department. This disagreement revealed the attitudes toward and interpretations of globalization in the light of the policies and procedures related to the department. How the faculty experience globalization is not consistent with the literature reviewed in this project. The literature states that globalization is a big part of higher education and is a phenomenon that causes changes in the goals and missions of higher education institutions (Knight, 2003; De Witt, 2005). The data revealed that globalization is not the cause of change but more a consequence of actions that take place in achieving the goals and missions of the department.

  14. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  15. Reference computations of public dose and cancer risk from airborne releases of uranium and Class W plutonium

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, V.L.

    1995-06-06

    This report presents "reference" computations that can be used by safety analysts in the evaluations of the consequences of postulated atmospheric releases of radionuclides from the Rocky Flats Environmental Technology Site. These computations deal specifically with doses and health risks to the public. The radionuclides considered are Class W Plutonium, all classes of Enriched Uranium, and all classes of Depleted Uranium. (The other class of plutonium, Y, was treated in an earlier report.) In each case, one gram of the respirable material is assumed to be released at ground level, both with and without fire. The resulting doses and health risks can be scaled to whatever amount of release is appropriate for a postulated accident being investigated. The report begins with a summary of the organ-specific stochastic risk factors appropriate for alpha radiation, which poses the main health risk of plutonium and uranium. This is followed by a summary of the atmospheric dispersion factors for unfavorable and typical weather conditions for the calculation of consequences to both the Maximum Offsite Individual and the general population within 80 km (50 miles) of the site.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  18. Replacing the wild type loxP site in BACs from the public domain with lox66 using a lox66 transposon

    Directory of Open Access Journals (Sweden)

    Stennett Naima

    2010-02-01

    Abstract Background Chromatin adjoining the site of integration of a transgene affects expression and renders comparisons of closely related transgenes, such as those derived from a BAC deletion series retrofitted with enhancer-traps, unreliable. Gene targeting to a pre-determined site on the chromosome is likely to alleviate the problem. Findings A general procedure to replace the loxP site located at one end of genomic DNA inserts in BACs with lox66 is described. Truncating insert DNA from the loxP end with a Tn10 transposon carrying a lox66 site simultaneously substitutes the loxP with a lox66 sequence. The replacement occurs with high stringency, and the procedure should be applicable to all BACs in the public domain. Cre recombination of loxP with lox66 or lox71 was found to be as efficient as with another loxP site during phage P1 transduction of small plasmids containing those sites. However, the end-deletion of insert DNA in BACs using a lox66 transposon occurred at no more than 20% of the efficiency observed with a loxP transposon. Differences in the ability of the Cre protein available at different stages of the P1 life cycle to recombine identical versus non-identical lox sites are likely responsible for this discrepancy. A possible mechanism to explain these findings is discussed. Conclusions The loxP/lox66 replacement procedure should allow targeting BACs to a pre-positioned lox71 site in zebrafish chromosomes, a system where homologous recombination-mediated "knock-in" technology is unavailable.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and to providing room for ample dialog between the groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  6. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  7. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility in using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  14. Domain analysis

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.

  15. Stress Domains in Si(111)/a-Si3N4 Nanopixel: Ten-Million-Atom Molecular Dynamics Simulations on Parallel Computers

    Science.gov (United States)

    Omeltchenko, Andrey; Bachlechner, Martina E.; Nakano, Aiichiro; Kalia, Rajiv K.; Vashishta, Priya; Ebbsjö, Ingvar; Madhukar, Anupam; Messina, Paul

    2000-01-01

    Parallel molecular dynamics simulations are performed to determine atomic-level stresses in Si(111)/Si3N4(0001) and Si(111)/a-Si3N4 nanopixels. Compared to the crystalline case, the stresses in amorphous Si3N4 are highly inhomogeneous in the plane of the interface. In silicon below the interface, for a 25 nm square mesa stress domains with triangular symmetry are observed, whereas for a rectangular, 54 nm × 33 nm, mesa tensile stress domains (~300 Å) are separated by a Y-shaped compressive domain wall. Maximum stresses in the domains and domain walls are -2 GPa and +2 GPa, respectively.
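
    The "atomic-level stresses" mapped in such simulations are typically computed from a per-atom virial expression. The abstract does not give the paper's exact formula, but a commonly used form (a hedged illustration; sign and volume-partition conventions vary between codes) for the stress tensor of atom i with assigned local volume Ω_i is:

    ```latex
    \sigma_i^{\alpha\beta} = \frac{1}{\Omega_i}
      \left( m_i\, v_i^{\alpha} v_i^{\beta}
      + \frac{1}{2} \sum_{j \neq i} F_{ij}^{\alpha}\, r_{ij}^{\beta} \right)
    ```

    Here α and β index Cartesian components, F_ij is the force on atom i due to atom j, and r_ij is their separation vector; averaging σ_i over small regions yields the stress-domain maps described above.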

  16. Stress domains in Si(111)/a-Si3N4 nanopixel: ten-million-atom molecular dynamics simulations on parallel computers

    Science.gov (United States)

    Omeltchenko; Bachlechner; Nakano; Kalia; Vashishta; Ebbsjo; Madhukar; Messina

    2000-01-10

    Parallel molecular dynamics simulations are performed to determine atomic-level stresses in Si(111)/Si3N4(0001) and Si(111)/a-Si3N4 nanopixels. Compared to the crystalline case, the stresses in amorphous Si3N4 are highly inhomogeneous in the plane of the interface. In silicon below the interface, for a 25 nm square mesa stress domains with triangular symmetry are observed, whereas for a rectangular, 54 nm × 33 nm, mesa tensile stress domains (approximately 300 Å) are separated by a Y-shaped compressive domain wall. Maximum stresses in the domains and domain walls are -2 GPa and +2 GPa, respectively.

  17. Quality criteria for electronic publications in medicine.

    Science.gov (United States)

    Schulz, S; Auhuber, T; Schrader, U; Klar, R

    1998-01-01

    This paper defines "electronic publications in medicine (EPM)" as computer-based training programs, databases, knowledge-based systems, multimedia applications and electronic books running on standard platforms and available through the usual distribution channels. A detailed catalogue of quality criteria as a basis for the development and evaluation of EPMs is presented. The necessity of raising the quality level of electronic publications is stressed, considering aspects of domain knowledge, software engineering, media development, interface design and didactics.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. GlideInWMS and its components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. La apropiación del dominio público y las posibilidades de acceso a los bienes culturales | The appropriation of the public domain and the possibilities of access to cultural goods

    Directory of Open Access Journals (Sweden)

    Joan Ramos Toledano

    2017-06-01

    The legislation of continental intellectual property and copyright provides for a period of protection granting exclusive and temporary economic rights. After a certain period, protected works enter what is called the public domain. This is often considered the moment at which cultural goods come under the control and domain of society as a whole. The present paper argues that, given our current economic system, the public domain actually functions more as a business opportunity for certain companies than as a real option for the public to access artistic and intellectual works.

  20. Security Architecture of Cloud Computing

    Directory of Open Access Journals (Sweden)

    V.KRISHNA REDDY

    2011-09-01

    Cloud computing offers services over the Internet with dynamically scalable resources, providing benefits to users in terms of cost and ease of use. Cloud computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet large data processing and storage needs. The cloud computing environment has various advantages as well as disadvantages for the data security of service consumers. This paper aims to emphasize the main security issues existing in cloud computing environments. Security issues at the various levels of the cloud computing environment are identified and categorized based on the cloud computing architecture. The paper focuses on the usage of cloud services and on the security issues involved in building cross-domain, Internet-connected collaborations.

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the availability of more sites such that they can participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  2. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    Science.gov (United States)

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and the use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), a recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p ...). Ownership of a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with modern theories of adult learning.
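
    The adjusted odds ratios quoted above come from multi-level logistic regression, but the underlying quantity is easy to illustrate on an unadjusted 2×2 table. The sketch below uses toy numbers, not the study's data, and the function name is ours:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Unadjusted odds ratio and 95% CI for a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # hypothetical counts: computer owners vs non-owners who did data analysis
    or_, lo, hi = odds_ratio_ci(a=300, b=150, c=80, d=70)
    ```

    An adjusted OR differs in that the exposure coefficient is estimated jointly with other covariates in a regression, then exponentiated; the 2×2 version shown here ignores confounding.
    
    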

  3. Stress Domains in Si(111)/a-Si3N4 Nanopixel: Ten-Million-Atom Molecular Dynamics Simulations on Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Omeltchenko, Andrey; Bachlechner, Martina E.; Nakano, Aiichiro; Kalia, Rajiv K.; Vashishta, Priya; Ebbsjoe, Ingvar; Madhukar, Anupam; Messina, Paul

    2000-01-10

    Parallel molecular dynamics simulations are performed to determine atomic-level stresses in Si(111)/Si3N4(0001) and Si(111)/a-Si3N4 nanopixels. Compared to the crystalline case, the stresses in amorphous Si3N4 are highly inhomogeneous in the plane of the interface. In silicon below the interface, for a 25 nm square mesa stress domains with triangular symmetry are observed, whereas for a rectangular, 54 nm × 33 nm, mesa tensile stress domains (~300 Å) are separated by a Y-shaped compressive domain wall. Maximum stresses in the domains and domain walls are -2 GPa and +2 GPa, respectively. (c) 2000 The American Physical Society.

  4. Domains of quality of life: Results of a three-stage Delphi consensus-procedure amongst patients, family of patients, clinicians, scientists and the general public

    NARCIS (Netherlands)

    Pietersma, S.; de Vries, M.; Akker van den, M.E.

    2014-01-01

    Purpose Our key objective is to identify the core domains of health-related quality of life (QoL). Health-related QoL utility scales are commonly used in economic evaluations to assess the effectiveness of health-care interventions. However, health-care interventions are likely to affect QoL in a...

  5. 水下声散射一致性时域有限差分法的并行算法%Parallel computation of unified finite-difference time-domain for underwater sound scattering

    Institute of Scientific and Technical Information of China (English)

    冯玉田; 王朔中

    2008-01-01

    In this work, we treat scattering objects, water, surface and bottom in a truly unified manner in a parallel finite-difference time-domain (FDTD) scheme, which is suitable for distributed parallel computing in a message passing interface (MPI) programming environment. The algorithm is implemented on a cluster-based high performance computer system. Parallel computation is performed with different division methods in 2D and 3D situations. Based on analysis of the main factors affecting the speedup rate and parallel efficiency, data communication is reduced by selecting a suitable scheme of task division. A desirable scheme is recommended, giving a higher speedup rate and better efficiency. The results indicate that the unified parallel FDTD algorithm provides a solution to the numerical computation of acoustic scattering.
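
    The per-step boundary exchange behind such an MPI-parallel FDTD scheme can be sketched serially. The toy below is our own illustration (not the authors' code): it advances a 1D leapfrog wave update on one monolithic domain and on two subdomains that each carry one ghost cell, exchanged every step the way MPI halos would be; the stitched-together split result should match the monolithic run.

    ```python
    import math

    def leapfrog_step(u_prev, u, c2, lo, hi):
        """One step of u_tt = c^2 u_xx on indices lo..hi-1; other cells copied."""
        u_next = list(u)
        for i in range(lo, hi):
            u_next[i] = 2*u[i] - u_prev[i] + c2*(u[i+1] - 2*u[i] + u[i-1])
        return u_next

    N, steps = 64, 50
    c2 = 0.25  # (c*dt/dx)^2, safely below the CFL limit of 1
    u0 = [math.sin(math.pi * i / (N - 1)) for i in range(N)]

    # monolithic reference run (Dirichlet ends held fixed)
    up, u = list(u0), list(u0)
    for _ in range(steps):
        up, u = u, leapfrog_step(up, u, c2, 1, N - 1)

    # two subdomains with one ghost cell each, exchanged every step (mimics MPI halos)
    m = N // 2
    lp, lu = list(u0[:m+1]), list(u0[:m+1])  # left owns cells 0..m-1, ghost at m
    rp, ru = list(u0[m-1:]), list(u0[m-1:])  # right owns m..N-1, ghost at local 0
    for _ in range(steps):
        lp, lu = lu, leapfrog_step(lp, lu, c2, 1, m)            # update global 1..m-1
        rp, ru = ru, leapfrog_step(rp, ru, c2, 1, len(ru) - 1)  # update global m..N-2
        lu[m] = ru[1]    # halo exchange: left ghost <- right's first owned cell
        ru[0] = lu[m-1]  # halo exchange: right ghost <- left's last owned cell
    split = lu[:m] + ru[1:]  # drop ghosts and stitch the global field together
    ```

    In a real MPI run the two ghost assignments become a pair of `Sendrecv` calls between neighbouring ranks; minimizing the surface area of each subdomain (the task-division choice discussed above) minimizes this communication.
    
    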

  6. Opening of energy markets: consequences on the missions of public utility and of security of supplies in the domain of electric power and gas; Ouverture des marches energetiques: consequences sur les missions de service public et de securite d'approvisionnement pour l'electricite et le gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This conference was jointly organized by the International Energy Agency (IEA) and the French ministry of economy, finances, and industry (general direction of energy and raw materials, DGEMP). It was organized in 6 sessions dealing with: 1 - the public utility in the domain of energy: definition of the public utility missions, experience feedback about liberalized markets, public utility obligation and pricing regulation; 2 - the new US energy policy and the lessons learnt from the California crisis; 3 - the security of electric power supplies: concepts of security of supplies, opinion of operators, security of power supplies versus liberalization and investments; 4 - security of gas supplies: markets liberalization and investments, long-term contracts and security of supplies; 5 - debate: how to integrate the objectives of public utility and of security of supplies in a competing market; 6 - conclusions. This document brings together the available talks and transparencies presented at the conference. (J.S.)

  7. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  8. 76 FR 67418 - Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing...

    Science.gov (United States)

    2011-11-01

    .... SUPPLEMENTARY INFORMATION: The National Institute of Standards and Technology (NIST) has a technology leadership role in support of a secure and effectively adopted Cloud Computing model \1\ to reduce costs and improve services. This role is described in the 2011 Federal Cloud Computing Strategy \2\ as ``a central...

  9. Expanding the landscape of chromatin modification (CM)-related functional domains and genes in human.

    Directory of Open Access Journals (Sweden)

    Shuye Pu

    Chromatin modification (CM) plays a key role in regulating transcription, DNA replication, repair and recombination. However, our knowledge of these processes in humans remains very limited. Here we use computational approaches to study proteins and functional domains involved in CM in humans. We analyze the abundance and the pair-wise domain-domain co-occurrences of 25 well-documented CM domains in 5 model organisms: yeast, worm, fly, mouse and human. Results show that domains involved in histone methylation, DNA methylation, and histone variants are remarkably expanded in metazoans, reflecting the increased demand for cell type-specific gene regulation. We find that CM domains tend to co-occur with a limited number of partner domains and are hence not promiscuous. This property is exploited to identify 47 potentially novel CM domains, including 24 DNA-binding domains, whose role in CM has received little attention so far. Lastly, we use a consensus Machine Learning approach to predict 379 novel CM genes (coding for 329 proteins) in humans based on domain compositions. Several of these predictions are supported by very recent experimental studies and others are slated for experimental verification. Identification of novel CM genes and domains in humans will aid our understanding of fundamental epigenetic processes that are important for stem cell differentiation and cancer biology. Information on all the candidate CM domains and genes reported here is publicly available.
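
    The pair-wise domain-domain co-occurrence statistic described above can be illustrated in a few lines. This is a generic sketch over made-up domain annotations (the study's actual data and pipeline are not reproduced here):

    ```python
    from collections import Counter
    from itertools import combinations

    def domain_cooccurrences(proteins):
        """Count how many proteins contain each unordered pair of domains.

        proteins: dict mapping protein id -> iterable of domain names.
        Returns a Counter keyed by sorted (domainA, domainB) tuples.
        """
        pairs = Counter()
        for domains in proteins.values():
            # each distinct pair is counted once per protein
            for pair in combinations(sorted(set(domains)), 2):
                pairs[pair] += 1
        return pairs

    # hypothetical annotations, for illustration only
    toy = {
        "protein1": ["PHD", "Bromo", "SET"],
        "protein2": ["PHD", "Bromo"],
        "protein3": ["Chromo", "SET"],
    }
    counts = domain_cooccurrences(toy)
    ```

    A low number of distinct partners per domain across such counts is what the authors mean by CM domains being "not promiscuous".
    
    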

  10. Bergman kernel on generalized exceptional Hua domain

    Institute of Scientific and Technical Information of China (English)

    YIN, Weiping (殷慰萍); ZHAO, Zhengang (赵振刚)

    2002-01-01

    We have computed the Bergman kernel functions explicitly for two types of generalized exceptional Hua domains, and also studied the asymptotic behavior of the Bergman kernel function of the exceptional Hua domain near boundary points, based on Appell's multivariable hypergeometric function.
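
    For readers unfamiliar with the object being computed: the Bergman kernel is the reproducing kernel of the space of square-integrable holomorphic functions on a domain. On the unit disc (the simplest case, not one of the Hua domains treated above) it has the classical closed form:

    ```latex
    K(z, w) = \frac{1}{\pi \, (1 - z\bar{w})^{2}}, \qquad z, w \in \mathbb{D}
    ```

    The results summarized above give analogous, far more involved, explicit formulas for generalized exceptional Hua domains in terms of Appell hypergeometric functions.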

  11. Agricultural Internet Public Opinion Monitoring System under Cloud Computing Environment

    Institute of Scientific and Technical Information of China (English)

    陈涛; 刘世洪

    2015-01-01

    For the large volume of agricultural information produced by micro-blogs, BBS and online social network media, and the challenges this brings to public opinion monitoring, an Agricultural Internet Public Opinion Monitoring System (AIPOMS) under a cloud computing environment is proposed. This paper focuses on the model of Internet public opinion monitoring, covering three aspects: public opinion information collection, public opinion analysis and public opinion services. The system can mine and analyze large-scale collected data, realizing the recognition of sensitive topics, the detection and tracking of hot topics, and the visualization of analysis results. It can provide a scientific basis for agricultural departments and decision makers to detect hot information, sensitive information and public opinion trends in a timely manner, and is of great significance in the field of agricultural applications.

  12. Computational analysis of the extracellular domain of the Ca²⁺-sensing receptor: an alternate model for the Ca²⁺ sensing region.

    Science.gov (United States)

    Morrill, Gene A; Kostellow, Adele B; Gupta, Raj K

    2015-03-27

    The extracellular Ca(2+) sensing receptor (CaSR) belongs to Class C G-protein-coupled receptors (GPCRs) which include receptors for amino acids, γ-aminobutyric acid and glutamate neurotransmitters. CaSR has been described as having an extended sequence containing a Ca(2+) binding pocket within an extracellular amino (N)-terminal domain, called a Venus Fly Trap (VFT) module. CaSR is thought to consist of three domains: 1) a Ca(2+)-sensory domain, 2) a region containing 7 transmembrane (TM) helices, and 3) a carboxy (C)-terminal tail. We find that SPOCTOPUS (a combination of hidden Markov models and artificial neural networks) predicts that Homo sapiens CaSR contains two additional TM helices ((190)D - G(210); (262)S - E(282)), with the second TM helix containing a pore-lining region ((265)K - I(280)). This predicts that the putative Ca(2+) sensory domain is within an extracellular loop, N-terminal to the highly conserved heptahelical bundle. This loop contains both the cysteine-rich domain ((537)V - C(598)) and a 14 residue "linker" sequence ((599)I - F(612)) thought to support signal transmission to the heptahelical bundle. Thus domain 1 may contain a 189 residue N-terminal extracellular region followed successively by TM-1, a short intracellular loop, TM-2 and a 329 residue extracellular loop; rather than the proposed 620 residue VFT module based on crystallography of the N-terminal region of mGluR1. Since the topologies of the two proteins differ, the published CaSR VFT model is questionable. CaSR also contains multiple caveolin-binding motifs and cholesterol-binding (CRAC/CARC) domains, facilitating localization to plasma membrane lipid rafts. Ion sensing may involve combination of pore-lining regions from CaSR dimers and CaSR-bound caveolins to form ion channels capable of monitoring ionized Ca(2+) levels. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Predictors of Biased Self-perception in Individuals with High Social Anxiety: The Effect of Self-consciousness in the Private and Public Self Domains.

    Science.gov (United States)

    Nordahl, Henrik; Plummer, Alice; Wells, Adrian

    2017-01-01

    "Biased self-perception," the tendency to perceive one's social performance as more negative than observers do, is characteristic of socially anxious individuals. Self-attention processes are hypothesised to underlie biased self-perception; however, different models emphasise different aspects of self-attention, with attention to the public aspects of the self being prominent. The current study aimed to investigate the relative contribution of two types of dispositional self-attention, public and private self-consciousness, to biased self-perception in a high (n = 48) versus a low (n = 48) social anxiety group undergoing an interaction task. The main finding was that private self-consciousness explained substantial and unique variance in biased negative self-perception in individuals with high social anxiety, while public self-consciousness did not. This relationship was independent of increments in state anxiety. Private self-consciousness appeared to have a specific association with bias related to overestimation of negative social performance rather than underestimation of positive social performance. The implication of this finding is that current treatment models of social anxiety disorder might include broader aspects of self-focused attention, especially in the context of formulating self-evaluation biases.

  14. Predictors of Biased Self-perception in Individuals with High Social Anxiety: The Effect of Self-consciousness in the Private and Public Self Domains

    Directory of Open Access Journals (Sweden)

    Henrik Nordahl

    2017-07-01

    Full Text Available “Biased self-perception,” the tendency to perceive one’s social performance as more negative than observers do, is characteristic of socially anxious individuals. Self-attention processes are hypothesised to underlie biased self-perception; however, different models emphasise different aspects of self-attention, with attention to the public aspects of the self being prominent. The current study aimed to investigate the relative contribution of two types of dispositional self-attention, public and private self-consciousness, to biased self-perception in a high (n = 48) versus a low (n = 48) social anxiety group undergoing an interaction task. The main finding was that private self-consciousness explained substantial and unique variance in biased negative self-perception in individuals with high social anxiety, while public self-consciousness did not. This relationship was independent of increments in state anxiety. Private self-consciousness appeared to have a specific association with bias related to overestimation of negative social performance rather than underestimation of positive social performance. The implication of this finding is that current treatment models of social anxiety disorder might include broader aspects of self-focused attention, especially in the context of formulating self-evaluation biases.

  15. Computer modelling in combination with in vitro studies reveals similar binding affinities of Drosophila Crumbs for the PDZ domains of Stardust and DmPar-6.

    Science.gov (United States)

    Kempkens, Ozlem; Médina, Emmanuelle; Fernandez-Ballester, Gregorio; Ozüyaman, Susann; Le Bivic, André; Serrano, Luis; Knust, Elisabeth

    2006-08-01

    Formation of multiprotein complexes is a common theme to pattern a cell, thereby generating spatially and functionally distinct entities at specialised regions. Central components of these complexes are scaffold proteins, which contain several protein-protein interaction domains and provide a platform to recruit a variety of additional components. There is increasing evidence that protein complexes are dynamic structures and that their components can undergo various interactions depending on the cellular context. However, little is known so far about the factors regulating this behaviour. One evolutionarily conserved protein complex, which can be found both in Drosophila and mammalian epithelial cells, is composed of the transmembrane protein Crumbs/Crb3 and the scaffolding proteins Stardust/Pals1 and DPATJ/PATJ, respectively, and localises apically to the zonula adherens. Here we show by in vitro analysis that, similar as in vertebrates, the single PDZ domain of Drosophila DmPar-6 can bind to the four C-terminal amino acids (ERLI) of the transmembrane protein Crumbs. To further evaluate the binding capability of Crumbs to DmPar-6 and the MAGUK protein Stardust, analysis of the PDZ structural database and modelling of the interactions between the C-terminus of Crumbs and the PDZ domains of these two proteins were performed. The results suggest that both PDZ domains bind Crumbs with similar affinities. These data are supported by quantitative yeast two-hybrid interactions. In vivo analysis performed in cell cultures and in the Drosophila embryo show that the cytoplasmic domain of Crumbs can recruit DmPar-6 and DaPKC to the plasma membrane. The data presented here are discussed with respect to possible dynamic interactions between these proteins.

  16. Network Hatred: Obstruction and Decomposition of Order Construction in the Virtual Public Domain

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    Network hatred is both an emotional experience and a state of existence. A growing number of virtual social phenomena indicate that network hatred has diffused to a certain degree, so the construction of order in the virtual public domain faces practical obstructions that cannot be evaded. Network hatred leads to frequent network violence involving no direct conflict of interests, intensifies class confrontation rooted in network populism, and produces a subversive displacement of values in cyberspace. At its root, network hatred is closely related to the egalitarian ideals of the online world, the competitive consciousness of modern society, and practical difficulties that are hard to overcome. Only by improving public reason in the virtual public domain, moderately satisfying the need to vent network hatred, constructing a network supervision system oriented toward inclusive development, and correcting the displacement of values in the virtual public domain can network hatred be truly dissolved and the orderly operation of the virtual public domain be achieved.

  17. Technology-mediated language learning: an emergent research domain under study through the publications of the journal Alsic

    Directory of Open Access Journals (Sweden)

    Nicolas Guichon

    2012-11-01

    Full Text Available In this study, it is postulated that technology-mediated language learning is a research domain that focuses on the design and integration of technologies for language learning and teaching. Because this domain is emergent, the present study first aims at understanding how a community of researchers has developed around this object. Then, through the critical analysis of 79 articles published between 1998 and 2010 in Alsic, a French-speaking online journal, the present article endeavours to define the epistemological contours of this research domain by studying the means employed to produce knowledge.

  18. Research on the Computer Rank Examination and the Teaching of Public Computer Courses in Colleges and Universities

    Institute of Scientific and Technical Information of China (English)

    朱琳; 马蓉; 周钧

    2013-01-01

    This paper discusses the relationship between the National Computer Rank Examination and the teaching of public basic computer courses in colleges and universities, together with the existing problems, and offers some views on how to strengthen public computer course teaching and handle its relationship with the rank examination scientifically and correctly. It focuses on the examination's role in promoting the development of teaching, on its negative influence as a teaching orientation, and on how to resolve the contradiction between the two. The techniques and results of computer-aided education are applied with the aim of making basic computer teaching and the rank examination promote each other and develop together, so that both serve teaching activities and the development of education, creating a good educational environment that helps students learn better and more efficiently at school.

  19. Structural Insight for Roles of DR5 Death Domain Mutations on Oligomerization of DR5 Death Domain-FADD Complex in the Death-Inducing Signaling Complex Formation: A Computational Study.

    Science.gov (United States)

    Yang, Hongyi; Song, Yuhua

    2016-04-01

    Death receptor 5 (DR5)-induced apoptosis that prioritizes the death of tumor cells has been proposed as one of the promising cancer therapies. In this process, oligomerized DR5 death domain (DD) binding to Fas-associated death domain (FADD) leads to FADD activating caspase-8, which marks the formation of the death-inducing signaling complex (DISC) that initiates apoptosis. DR5 DD mutations found in cancer cells have been suggested to play an important pathological role, the mechanism through which those mutants prevent the DR5-activated DISC formation is not clear yet. This study sought to provide structural and molecular insight for the roles of four selected DR5 DD mutations (E355K, E367K, K415N, and L363F) in the oligomerization of DR5 DD-FADD complex during the DISC formation. Results from the molecular dynamics simulations show that the simulated mutants induce conformational, dynamical motions and interactions changes in the DR5 DD-FADD tetramer complex, including changes in a protein's backbone flexibility, less exposure of FADD DED's caspase-8 binding site, reduced H-bonding and hydrophobic contacts at the DR5 DD-FADD DD binding, altered distribution of the electrostatic potentials and correlated motions of residues, and reduced binding affinity of DR5 DD binding to FADD. This study provides structural and molecular insight for the influence of DR5 DD mutations on oligomerization of DR5 DD-FADD complex, which is expected to foster understanding of the DR5 DD mutants' resistance mechanism against DR5-activated DISC formation.

  20. Exploration of Public Computer Room Management Based on PowerShell

    Institute of Scientific and Technical Information of China (English)

    郭亮; 郭海智; 谢光

    2015-01-01

    Improving the rapid-response capability of public computer rooms is an important problem in computer room management. Using third-party management software raises operating costs and lengthens the time needed to solve problems. Taking the National Computer Rank Examination environment as an example, this paper uses PowerShell to complete a variety of settings quickly, regardless of the operating system version, meeting most of the management needs of public computer rooms while saving manpower and material resources.

  1. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 2: User's manual and program listing

    Science.gov (United States)

    Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
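
    The transfinite-interpolation idea behind GRID2D/3D can be sketched compactly. The following Python sketch is an illustration of the general algebraic method, not code from GRID2D/3D itself (which is FORTRAN 77); the annular quarter-domain and all names are invented for the example. The interior grid is blended from four boundary curves, with the shared corner points subtracted once to avoid double counting.

```python
import numpy as np

def tfi_2d(bottom, top, left, right):
    """2D transfinite interpolation.
    bottom/top: (ni, 2) boundary curves; left/right: (nj, 2) curves
    sharing corner points with bottom and top."""
    ni, nj = bottom.shape[0], left.shape[0]
    u = np.linspace(0.0, 1.0, ni)[:, None, None]   # parameter along bottom/top
    v = np.linspace(0.0, 1.0, nj)[None, :, None]   # parameter along left/right
    # Linear blend of opposite boundaries, minus the bilinear corner term.
    grid = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])
    return grid                                     # shape (ni, nj, 2)

# Illustrative domain: a quarter annulus between radii 1 and 2.
theta = np.linspace(0.0, np.pi / 2, 21)
bottom = np.c_[np.cos(theta), np.sin(theta)]        # inner arc
top = np.c_[2 * np.cos(theta), 2 * np.sin(theta)]   # outer arc
left = np.c_[np.linspace(1, 2, 11), np.zeros(11)]   # radial edge at theta = 0
right = np.c_[np.zeros(11), np.linspace(1, 2, 11)]  # radial edge at theta = pi/2
grid = tfi_2d(bottom, top, left, right)
```

    By construction the generated grid reproduces all four boundary curves exactly; stretching functions, as in GRID2D/3D, would replace the uniform `u`, `v` parameters to control interior point clustering.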

  2. Visualizing domain wall and reverse domain superconductivity.

    Science.gov (United States)

    Iavarone, M; Moore, S A; Fedor, J; Ciocys, S T; Karapetrov, G; Pearson, J; Novosad, V; Bader, S D

    2014-08-28

    In magnetically coupled, planar ferromagnet-superconductor (F/S) hybrid structures, magnetic domain walls can be used to spatially confine the superconductivity. In contrast to a superconductor in a uniform applied magnetic field, the nucleation of the superconducting order parameter in F/S structures is governed by the inhomogeneous magnetic field distribution. The interplay between the superconductivity localized at the domain walls and far from the walls leads to effects such as re-entrant superconductivity and reverse domain superconductivity with the critical temperature depending upon the location. Here we use scanning tunnelling spectroscopy to directly image the nucleation of superconductivity at the domain wall in F/S structures realized with Co-Pd multilayers and Pb thin films. Our results demonstrate that such F/S structures are attractive model systems that offer the possibility to control the strength and the location of the superconducting nucleus by applying an external magnetic field, potentially useful to guide vortices for computing application.

  3. A COMPUTER DOCKING STUDY OF THE BINDING OF POLYCYCLIC AROMATIC HYDROCARBONS AND THEIR METABOLITES TO THE LIGAND-BINDING DOMAIN OF THE ESTROGEN RECEPTOR

    Science.gov (United States)

    Polycyclic aromatic hydrocarbons (PAHs) are a class of ubiquitous, anthropogenic chemicals found in the environment. In the present study, computational methods are used to evaluate their potential estrogenicity and the contribution chemicals in this class make to environmental e...

  4. A proposal for a computer-based framework of support for public health in the management of biological incidents: the Czech Republic experience.

    Science.gov (United States)

    Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel

    2012-11-01

    Biological incidents jeopardising public health require decision-making that consists of one dominant feature: complexity. Therefore, public health decision-makers necessitate appropriate support. Based on the analogy with business intelligence (BI) principles, the contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, the analysis of potential inputs to the framework is conducted and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts. Their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.

  5. 76 FR 12398 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Bureau of the Public Debt (BPD...

    Science.gov (United States)

    2011-03-07

    ... comparison file compiled of records from our expanded Medicare Database (MDB) File system of records in order to support our administration of the prescription drug subsidy program. The MDB File system of... computer systems and provide the response file to us as soon as possible. This agreement covers...

  6. Trends of Mobile Learning in Computing Education from 2006 to 2014: A Systematic Review of Research Publications

    Science.gov (United States)

    Anohah, Ebenezer; Oyelere, Solomon Sunday; Suhonen, Jarkko

    2017-01-01

    The majority of the existing research regarding mobile learning in computing education has primarily focused on studying the effectiveness of, and in some cases reporting about, implemented mobile learning solutions. However, it is equally important to explore development and application perspectives on the integration of mobile learning into…

  8. Practice of a New Assessment Method for Public Computer Courses

    Institute of Scientific and Technical Information of China (English)

    白伟

    2011-01-01

    By analyzing the current teaching situation of public computer courses and the bottlenecks in their assessment, new assessment methods are explored to further improve the existing appraisal system, making assessment fair and reasonable and a more realistic reflection of a student's computer abilities, reducing teachers' workload while stimulating students' interest in learning.

  9. Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    Tom Tabor, the owner of Tabor Communications, presents Wolfgang von Rüden with the Editors Choice Award of HPCwire, which was awarded to CERN for its commitment to educating the public about high-performance computing.

  10. Wavefield extrapolation in pseudodepth domain

    KAUST Repository

    Ma, Xuxin

    2013-02-01

    Wavefields are commonly computed in the Cartesian coordinate frame. The efficiency of this approach is inherently limited due to spatial oversampling in deep layers, where the velocity is high and wavelengths are long. To alleviate this computational waste due to uneven wavelength sampling, we convert the vertical axis of the conventional domain from depth to vertical time or pseudodepth. This creates a nonorthogonal Riemannian coordinate system. Isotropic and anisotropic wavefields can be extrapolated in the new coordinate frame with improved efficiency and good consistency with Cartesian domain extrapolation results. Prestack depth migrations are also evaluated based on the wavefield extrapolation in the pseudodepth domain. © 2013 Society of Exploration Geophysicists. All rights reserved.
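
    The depth-to-pseudodepth conversion amounts to integrating slowness along the vertical axis. A minimal Python sketch, assuming a simple 1-D velocity model v(z) and a two-way vertical time convention (both illustrative choices, not taken from the paper):

```python
import numpy as np

def depth_to_tau(z, v):
    """Cumulative two-way vertical traveltime tau(z) = 2 * sum(dz / v)."""
    dz = np.diff(z, prepend=z[0])          # first element is 0
    return np.cumsum(2.0 * dz / v)

z = np.linspace(0.0, 3000.0, 301)          # depth axis, metres (toy model)
v = 1500.0 + 0.8 * z                        # velocity increasing with depth
tau = depth_to_tau(z, v)

# A uniform grid in tau samples every layer with a comparable number of
# points per wavelength: equal tau steps map to larger depth steps where
# the velocity is high, removing the oversampling of deep layers.
tau_uniform = np.linspace(tau[0], tau[-1], 301)
z_of_tau = np.interp(tau_uniform, tau, z)   # depth location of each tau sample
```

    The widening spacing of `z_of_tau` with depth is exactly the saving the abstract describes: the deep, high-velocity part of the model needs fewer vertical samples in the pseudodepth frame.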

  11. Computational analysis of siRNA recognition by the Ago2 PAZ domain and identification of the determinants of RNA-induced gene silencing.

    Science.gov (United States)

    Kandeel, Mahmoud; Kitade, Yukio

    2013-01-01

    RNA interference (RNAi) is a highly specialized process of protein-siRNA interaction that results in the regulation of gene expression and cleavage of target mRNA. The PAZ domain of the Argonaute proteins binds to the 3' end of siRNA, and during RNAi the attaching end of the siRNA switches between binding and release from its binding pocket. This biphasic interaction of the 3' end of siRNA with the PAZ domain is essential for RNAi activity; however, it remains unclear whether stronger or weaker binding with PAZ domain will facilitate or hinder the overall RNAi process. Here we report the correlation between the binding of modified siRNA 3' overhang analogues and their in vivo RNAi efficacy. We found that higher RNAi efficacy was associated with the parameters of lower Ki value, lower total intermolecular energy, lower free energy, higher hydrogen bonding, smaller total surface of interaction and fewer van der Waals interactions. Electrostatic interaction was a minor contributor to compounds recognition, underscoring the presence of phosphate groups in the modified analogues. Thus, compounds with lower binding affinity are associated with better gene silencing. Lower binding strength along with the smaller interaction surface, higher hydrogen bonding and fewer van der Waals interactions were among the markers for favorable RNAi activity. Within the measured parameters, the interaction surface, van der Waals interactions and inhibition constant showed a statistically significant correlation with measured RNAi efficacy. The considerations provided in this report will be helpful in the design of new compounds with better gene silencing ability.
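
    The reported relationships between docking parameters and in vivo silencing efficacy are product-moment correlations. A small self-contained sketch, with invented numbers standing in for the paper's measured interaction surfaces and knockdown levels:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: larger interaction surface, lower silencing efficacy,
# mirroring the trend described in the abstract.
surface = [420.0, 455.0, 470.0, 510.0, 540.0]   # interaction areas (toy values)
efficacy = [0.82, 0.74, 0.70, 0.55, 0.48]       # knockdown fractions (toy values)
r = pearson_r(surface, efficacy)                # strongly negative for this data
```

    A negative `r` here corresponds to the abstract's finding that a smaller total surface of interaction is associated with better gene silencing.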

  12. Examining the Use of Computers in Writing by Learners of Japanese as a Foreign Language: Analysis of Kanji in the Handwritten and Typed Domains

    Science.gov (United States)

    Dixon, Michael

    2012-01-01

    This study compares second-year Japanese university students' strategies to write kanji by hand with their strategies to produce the kanji characters on a computer, taking into account factors such as accuracy in writing, the amount of kanji used, the complexity of the kanji used, as well as how the characters used compare with the sequence…

  13. Trusted Domain

    DEFF Research Database (Denmark)

    Hjorth, Theis Solberg; Torbensen, Rune

    2012-01-01

    that enables secure end-to-end communication with home automation devices, and it supports device revocations as well as a structure of intersecting sets of nodes for scalability. Devices in the Trusted Domain are registered in a list that is distributed using a robust epidemic protocol optimized...

  14. Domain crossing

    DEFF Research Database (Denmark)

    Schraefel, M. C.; Rouncefield, Mark; Kellogg, Wendy

    2012-01-01

    In CSCW, how much do we need to know about another domain/culture before we observe, intersect and intervene with designs. What optimally would that other culture need to know about us? Is this a “how long is a piece of string” question, or an inquiry where we can consider a variety of contexts a...

  15. Computational modeling of the bHLH domain of the transcription factor TWIST1 and R118C, S144R and K145E mutants

    Directory of Open Access Journals (Sweden)

    Maia Amanda M

    2012-07-01

    Full Text Available Abstract Background Human TWIST1 is a highly conserved member of the regulatory basic helix-loop-helix (bHLH) transcription factors. TWIST1 forms homo- or heterodimers with E-box proteins, such as E2A (isoforms E12 and E47), MYOD and HAND2. Haploinsufficiency germ-line mutations of the twist1 gene in humans are the main cause of Saethre-Chotzen syndrome (SCS), which is characterized by limb abnormalities and premature fusion of cranial sutures. Because of the importance of TWIST1 in the regulation of embryonic development and its relationship with SCS, along with the lack of an experimentally solved 3D structure, we performed comparative modeling for the TWIST1 bHLH region arranged into wild-type homodimers and heterodimers with E47. In addition, three mutations that promote DNA binding failure (R118C, S144R and K145E) were studied on the TWIST1 monomer. We also explored the behavior of the mutant forms in aqueous solution using molecular dynamics (MD) simulations, focusing on the structural changes of the wild-type versus mutant dimers. Results The solvent-accessible surface area of the homodimers was smaller on wild-type dimers, which indicates that the cleft between the monomers remained more open on the mutant homodimers. RMSD and RMSF analyses indicated that mutated dimers presented values that were higher than those for the wild-type dimers. For a more careful investigation, the monomer was subdivided into four regions: basic, helix I, loop and helix II. The basic domain presented a higher flexibility in all of the parameters that were analyzed, and the mutant dimer basic domains presented values that were higher than the wild-type dimers. The essential dynamic analysis also indicated a higher collective motion for the basic domain. Conclusions Our results suggest the mutations studied turned the dimers into more unstable structures with a wider cleft, which may be a reason for the loss of DNA binding capacity observed for in vitro
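
    The RMSD and RMSF analyses used to compare wild-type and mutant dimers reduce to per-frame and per-atom averages over a trajectory. A minimal sketch with a randomly generated stand-in for an MD trajectory of shape (n_frames, n_atoms, 3); a real analysis would first superpose each frame onto a reference, which is omitted here:

```python
import numpy as np

def rmsd(frame, ref):
    """Root-mean-square deviation between two aligned coordinate sets."""
    return np.sqrt(np.mean(np.sum((frame - ref) ** 2, axis=-1)))

def rmsf(traj):
    """Per-atom root-mean-square fluctuation about the average structure."""
    mean = traj.mean(axis=0)
    return np.sqrt(np.mean(np.sum((traj - mean) ** 2, axis=-1), axis=0))

rng = np.random.default_rng(0)
traj = rng.normal(scale=0.5, size=(100, 50, 3))  # fake 100-frame, 50-atom run
r = rmsd(traj[10], traj[0])                      # scalar deviation between frames
flex = rmsf(traj)                                # one fluctuation value per atom
```

    In the study's terms, consistently larger `rmsd`/`rmsf` values for the mutant trajectories are what flags the mutant dimers as less stable, with the basic domain showing the highest per-residue fluctuation.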

  16. The acoustics of public squares/places: A comparison between results from a computer simulation program and measurements in situ

    DEFF Research Database (Denmark)

    Paini, Dario; Rindel, Jens Holger; Gade, Anders

    2004-01-01

    In the context of a PhD thesis, in which the main purpose is to analyse the importance of the public square/place (“agora”) as a meeting point of sound and music, with particular regard to its use for concerts (amplified or not), a first step was taken, making comparisons between measurements in situ...... is not completely closed and not completely open, with highly reflecting and partially diffusing vertical surfaces (the facades) and with one totally absorbing surface (the sky). A natural application of these results will be the possibility to detect the best position for a sound source (typically an orchestra...... or a band during, for instance, music summer festivals) and the best position for the audience. A further result could be to propose some acoustic adjustments to achieve better acoustic quality by considering the acoustic parameters which are typically used for concert halls and opera houses....

  17. Effect of Size of the Computational Domain on Spherical Nonlinear Force-Free Modeling of Coronal Magnetic Field Using SDO/HMI Data

    CERN Document Server

    Tadesse, Tilaye; MacNeice, Peter

    2014-01-01

    The solar coronal magnetic field produces solar activity, including extremely energetic solar flares and coronal mass ejections (CMEs). Knowledge of the structure and evolution of the magnetic field of the solar corona is important for investigating and understanding the origins of space weather. Although the coronal field remains difficult to measure directly, there is considerable interest in accurate modeling of magnetic fields in and around sunspot regions on the Sun using photospheric vector magnetograms as boundary data. In this work, we investigate the effects of the size of the domain chosen for coronal magnetic field modeling on the resulting model solution. We apply a spherical optimization procedure to vector magnetogram data from the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO) with four Active Regions observed on 09 March 2012 at 20:55 UT. The results imply that quantities like magnetic flux density, electric current density and free magnetic energy density of ARs of interest are...

  18. Mapping the Knowledge Domain of Subject Headings in Public Sentiment Research Based on Multidimensional Scaling

    Institute of Scientific and Technical Information of China (English)

    孙艳; 田丽梅

    2016-01-01

    In order to objectively review the current state of public sentiment research and to summarize the intrinsic links and scientific structure of the literature, this paper studies the relevant research literature from "Chinese Core Journals" and "CSSCI" indexed by CNKI over the past five years. First, the preliminary data are prepared, covering the steps and methods of preparation and the associated mathematical model. Then, the dissimilarity coefficient matrix is input into SPSS to carry out multidimensional scaling and draw the knowledge map. Finally, the knowledge map is analysed in terms of dimension definition and spatial distribution. The results show that current public sentiment research concentrates on four directions: research directly related to public sentiment is the focus and hotspot, media-related research is also fairly active, but the results in some subdivided research directions are relatively dispersed.
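
    The multidimensional scaling step, run in SPSS in the study, can be reproduced with classical (Torgerson) MDS applied to a dissimilarity matrix. A sketch with a made-up 4x4 matrix of subject-heading dissimilarities, not the paper's data:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed an (n, n) symmetric dissimilarity matrix into k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n         # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                 # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                    # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]               # keep the k largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy dissimilarities between four research topics: two tight clusters
# ({0, 1} and {2, 3}) that lie far apart.
D = np.array([[0.0, 1.0, 4.0, 4.2],
              [1.0, 0.0, 4.1, 4.0],
              [4.0, 4.1, 0.0, 1.2],
              [4.2, 4.0, 1.2, 0.0]])
coords = classical_mds(D, k=2)                  # 2-D map coordinates per topic
```

    Plotting `coords` gives exactly the kind of knowledge map the study draws: nearby points are subject headings that co-occur strongly, and the axes are then interpreted as research dimensions.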

  19. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science places substantial weight on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and workshop proceedings. That results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...
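
    The citation-linkage network described here is, at its core, a directed graph keyed by publication identifiers. A toy sketch with invented paper IDs, showing the basic bookkeeping such an analysis starts from:

```python
from collections import defaultdict

# Edges point from the citing paper to the cited paper, as would be
# extracted from DBLP/CiteSeerX records. IDs are invented for the example.
citations = [("p1", "p3"), ("p2", "p3"), ("p2", "p4"),
             ("p4", "p3"), ("p5", "p4")]

in_degree = defaultdict(int)     # how often each paper is cited
out_links = defaultdict(list)    # each paper's reference list
for citing, cited in citations:
    in_degree[cited] += 1
    out_links[citing].append(cited)

most_cited = max(in_degree, key=in_degree.get)
```

    Aggregating such edges at the journal/conference level, rather than per paper, yields the venue-to-venue network on which the sub-discipline analysis is performed.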

  20. Generation of high-fidelity four-photon cluster state and quantum-domain demonstration of one-way quantum computing.

    Science.gov (United States)

    Tokunaga, Yuuki; Kuwashiro, Shin; Yamamoto, Takashi; Koashi, Masato; Imoto, Nobuyuki

    2008-05-30

    We experimentally demonstrate a simple scheme for generating a four-photon entangled cluster state with fidelity over 0.860 ± 0.015. We show that the fidelity is high enough to guarantee that the produced state is distinguished from Greenberger-Horne-Zeilinger, W, and Dicke types of genuine four-qubit entanglement. We also demonstrate basic operations of one-way quantum computing using the produced state and show that the output state fidelities surpass classical bounds, which indicates that the entanglement in the produced state essentially contributes to the quantum operation.

  1. Predicting domain-domain interaction based on domain profiles with feature selection and support vector machines

    Directory of Open Access Journals (Sweden)

    Liao Li

    2010-10-01

    Full Text Available Abstract Background Protein-protein interaction (PPI plays essential roles in cellular functions. The cost, time and other limitations associated with the current experimental methods have motivated the development of computational methods for predicting PPIs. As protein interactions generally occur via domains instead of the whole molecules, predicting domain-domain interaction (DDI is an important step toward PPI prediction. Computational methods developed so far have utilized information from various sources at different levels, from primary sequences, to molecular structures, to evolutionary profiles. Results In this paper, we propose a computational method to predict DDI using support vector machines (SVMs, based on domains represented as interaction profile hidden Markov models (ipHMM where interacting residues in domains are explicitly modeled according to the three dimensional structural information available at the Protein Data Bank (PDB. Features about the domains are extracted first as the Fisher scores derived from the ipHMM and then selected using singular value decomposition (SVD. Domain pairs are represented by concatenating their selected feature vectors, and classified by a support vector machine trained on these feature vectors. The method is tested by leave-one-out cross validation experiments with a set of interacting protein pairs adopted from the 3DID database. The prediction accuracy has shown significant improvement as compared to InterPreTS (Interaction Prediction through Tertiary Structure, an existing method for PPI prediction that also uses the sequences and complexes of known 3D structure. Conclusions We show that domain-domain interaction prediction can be significantly enhanced by exploiting information inherent in the domain profiles via feature selection based on Fisher scores, singular value decomposition and supervised learning based on support vector machines. 
Datasets and source code are freely available on
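The classification pipeline the abstract describes — feature vectors reduced via singular value decomposition, domain pairs represented by concatenating the two reduced vectors, then classified by a support vector machine — can be sketched with synthetic stand-in data. The "Fisher scores" below are random placeholders, not ipHMM output, and the labels are invented:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder feature vectors for 40 domains (100 features each),
# standing in for Fisher scores derived from an ipHMM.
X = rng.normal(size=(40, 100))

# Dimensionality reduction with SVD: project onto the top 10
# right-singular directions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_red = X @ Vt[:10].T  # shape (40, 10)

# A domain pair is the concatenation of its two reduced vectors.
pairs = [(i, j) for i in range(40) for j in range(i + 1, 40)][:200]
P = np.array([np.concatenate([X_red[i], X_red[j]]) for i, j in pairs])
y = rng.integers(0, 2, size=len(pairs))  # toy interact / non-interact labels

clf = SVC(kernel="rbf").fit(P, y)
preds = clf.predict(P)
```

In the actual method the labels would come from a gold standard such as 3DID, and accuracy would be estimated by leave-one-out cross validation rather than by predicting on the training pairs as done here for brevity.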

  2. Defining Domain Language of Graphical User Interfaces

    OpenAIRE

    Baciková, Michaela; Porubän, Jaroslav; Lakatos, Dominik

    2013-01-01

    Domain-specific languages (DSLs) are computer (programming, modeling, specification) languages devoted to solving problems in a specific domain. The least examined DSL development phases are analysis and design. Various formal methodologies exist; however, domain analysis is still done informally most of the time. There are also methodologies for deriving DSLs from existing ontologies, but the presumption is to have an ontology for the specific domain. We propose a solution of a user interface driven d...

  4. Evaluation to Obtain the Image According to the Spatial Domain Filtering of Various Convolution Kernels in the Multi-Detector Row Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hoo Min [Dept. of Radiologic Technology, Dongnam Health College, Suwon (Korea, Republic of); Yoo, Beong Gyu [Dept. of Radiologic Technology, Wonkwang Health Science College, Iksan (Korea, Republic of); Kweon, Dae Cheol [Dept. of Radiology, Seoul National University, Seoul (Korea, Republic of)

    2008-03-15

    Our objective was to evaluate spatial domain filtering as an alternative to additional image reconstruction using different kernels in MDCT. Images derived from thin collimated source data were generated using a water phantom and the abdomen with the B10 (very smooth), B20 (smooth), B30 (medium smooth), B40 (medium), B50 (medium sharp), B60 (sharp), B70 (very sharp) and B80 (ultra sharp) kernels. MTF and spatial resolution were measured with the various convolution kernels, and quantitative CT attenuation coefficient and noise measurements provided comparable HU (Hounsfield unit) values. CT attenuation coefficients (mean HU) were 1.1 to 1.8 HU in water and -998 to -1000 HU in air; noise was 5.4 to 44.8 HU in water and 3.6 to 31.4 HU in air. In abdominal fat, a CT attenuation coefficient of -2.2 to 0.8 HU and noise of 10.1 to 82.4 HU were measured; in abdominal muscle, a CT attenuation coefficient of 53.3 to 54.3 HU and noise of 10.4 to 70.7 HU; and in the liver parenchyma, a CT attenuation coefficient of 60.4 to 62.2 HU and noise of 7.6 to 63.8 HU. Images reconstructed with a sharper convolution kernel (B80) showed an increase in noise, whereas the CT attenuation coefficients remained comparable. Spatial-domain modification of image sharpness and noise could eliminate the need for reconstruction using different kernels in the future. Adjusting the various CT kernels, taking into account the examination being performed, may improve CT images and increase diagnostic accuracy.
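The trade-off the study measures — sharper kernels increasing noise while the mean attenuation stays comparable — can be reproduced in miniature with spatial-domain filtering. A sketch using SciPy; the kernels below are generic illustrative filters, not the vendor's B10-B80 reconstruction kernels:

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

rng = np.random.default_rng(0)
# Toy "CT slice": uniform tissue around 60 HU plus quantum noise.
img = 60.0 + rng.normal(scale=10.0, size=(64, 64))

# Smooth-kernel analogue: a 3x3 mean filter suppresses noise.
smooth = uniform_filter(img, size=3)

# Sharp-kernel analogue: Laplacian-style sharpening (weights sum to 1,
# so the mean level is preserved) boosts edges and, with them, noise.
sharpen = np.array([[0, -1, 0],
                    [-1, 5, -1],
                    [0, -1, 0]], dtype=float)
sharp = convolve(img, sharpen, mode="reflect")

# Noise (std dev) rises with the sharper kernel; the mean HU analogue
# stays comparable, mirroring the study's finding.
print(round(smooth.std(), 1), round(sharp.std(), 1))
```

Because both filters are applied in the spatial domain to an already-reconstructed image, no second reconstruction pass with a different kernel is needed, which is the alternative the study evaluates.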

  5. Computational studies on receptor-ligand interactions between novel buffalo (Bubalus bubalis) nucleotide-binding oligomerization domain-containing protein 2 (NOD2) variants and muramyl dipeptide (MDP).

    Science.gov (United States)

    Brahma, Biswajit; Patra, Mahesh Chandra; Mishra, Purusottam; De, Bidhan Chandra; Kumar, Sushil; Maharana, Jitendra; Vats, Ashutosh; Ahlawat, Sonika; Datta, Tirtha Kumar; De, Sachinandan

    2016-04-01

    Nucleotide binding and oligomerization domain 2 (NOD2), a member of intracellular NOD-like receptors (NLRs) family, recognizes the bacterial peptidoglycan, muramyl dipeptide (MDP) and initiates host immune response. The precise ligand recognition mechanism of NOD2 has remained elusive, although studies have suggested leucine rich repeat (LRR) region of NOD2 as the possible binding site of MDP. In this study, we identified multiple transcripts of NOD2 gene in buffalo (buNOD2) and at least five LRR variants (buNOD2-LRRW (wild type), buNOD2-LRRV1-V4) were found to be expressed in buffalo peripheral blood mononuclear cells. The newly identified buNOD2 transcripts were shorter in lengths as a result of exon-skipping and frame-shift mutations. Among the variants, buNOD2-LRRW, V1, and V3 were expressed more frequently in the animals studied. A comparative receptor-ligand interaction study through modeling of variants, docking, and molecular dynamics simulation revealed that the binding affinity of buNOD2-LRRW towards MDP was greater than that of the shorter variants. The absence of a LRR segment in the buNOD2 variants had probably affected their affinity toward MDP. Notwithstanding a high homology among the variants, the amino acid residues that interact with MDP were located on different LRR motifs. The binding free energy calculation revealed that the amino acids Arg850(LRR4) and Glu932(LRR7) of buNOD2-LRRW, Lys810(LRR3) of buNOD2-LRRV1, and Lys830(LRR3) of buNOD2-LRRV3 largely contributed towards MDP recognition. The knowledge of MDP recognition and binding modes on buNOD2 variants could be useful to understand the regulation of NOD-mediated immune response as well as to develop next generation anti-inflammatory compounds.

  6. Computational prediction and experimental characterization of a "size switch type repacking" during the evolution of dengue envelope protein domain III (ED3).

    Science.gov (United States)

    Elahi, Montasir; Islam, Monirul M; Noguchi, Keiichi; Yohda, Masafumi; Toh, Hiroyuki; Kuroda, Yutaka

    2014-03-01

    Dengue viruses (DEN) are classified into four serotypes (DEN1-DEN4) exhibiting high sequence and structural similarities, and infections by multiple serotypes can lead to the deadly dengue hemorrhagic fever. Here, we aim at characterizing the thermodynamic stability of DEN envelope protein domain III (ED3) during its evolution, and we report a structural analysis of DEN4wt ED3 combined with a systematic mutational analysis of residues 310 and 387. Molecular modeling based on our DEN3 and DEN4 ED3 structures indicated that the side-chains of residues 310/387, which are Val(310)/Ile(387) and Met(310)/Leu(387) in DEN3wt and DEN4wt, respectively, could be structurally compensated, and that a "size switch type repacking" might have occurred at these sites during the evolution of DEN into its four serotypes. This was experimentally confirmed by a 10°C and 5°C decrease in the thermal stability of, respectively, DEN3 ED3 variants with Met(310)/Ile(387) and Val(310)/Leu(387), whereas the variant with Met(310)/Leu(387), which contains a double mutation, had the same stability as the wild type DEN3. Namely, the Met310Val mutation should have preceded the Leu387Ile mutation in order to maintain the tight internal packing of ED3 and thus its thermodynamic stability. This view was confirmed by a phylogenetic reconstruction indicating that a common DEN ancestor would have Met(310)/Leu(387), and the intermediate node protein, Val(310)/Leu(387), which then mutated to the Val(310)/Ile(387) pair found in the present DEN3. The hypothesis was further confirmed by the observation that all of the present DEN viruses exhibit only stabilizing amino acid pairs at the 310/387 sites.

  7. Computational evolutionary analysis of the overlapped surface (S) and polymerase (P) regions in hepatitis B virus indicates the spacer domain in P is crucial for survival.

    Directory of Open Access Journals (Sweden)

    Ping Chen

    Full Text Available INTRODUCTION: The Hepatitis B Virus (HBV) genome contains four ORFs: S (surface), P (polymerase), C (core) and X. S is completely overlapped by P, and as a consequence the overlapping region is subject to distinctive evolutionary constraints compared to the remainder of the genome. Specifically, a non-synonymous substitution in one coding frame may produce a synonymous substitution in the alternative frame, suggesting a possible conflict between requirements for diversifying and purifying forces. To examine how these contrasting requirements are balanced within this region, we investigated the relationship amongst positive selection sites, conserved regions, epitopes and elements of protein structure to consider how HBV balances the contrasting evolutionary pressures. METHODOLOGY/RESULTS: 323 HBV genotype D genome sequences were collected and analyzed to identify sites under positive selection and highly conserved regions. Epitope sequences were retrieved from previously published experimental studies stored in the Immune Epitope Database. Predicted secondary structures were used to investigate the association between structure and conservation. Entropy was used as a measure of conservation, and bivariate logistic regression was used to investigate the relationship between positive selection/conserved sites and epitope/secondary structure regions. Our results indicate: (i) conservation in S is primarily dictated by α-helix elements in the protein structure; (ii) variable residues are mainly located in PreS, the major hydrophilic region (MHR) and the C-terminus; (iii) epitopes in S, which are directly targeted by the host immune system, are significantly associated with sites under positive selection. CONCLUSIONS: The highly variable spacer domain in P, which corresponds to PreS in S, appears to act as a harbor for the accumulation of mutations that can provide flexibility for conformational changes and responding to immune pressure.

  8. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    Ştefan IOVAN

    2016-05-01

    Full Text Available Cloud computing represents software applications offered as a service online, as well as the software and hardware components in the data center. When services are widely offered to any type of client, we are dealing with a public cloud. In the other case, in which a cloud is exclusively available to one organization and not to the general public, it is considered a private cloud [1]. There is also a third type, called hybrid, in which a user or an organization might use services available in both the public and the private cloud. One of the main challenges of cloud computing is to build trust and offer information privacy in every aspect of the service offered. The variety of existing standards, like the lack of clarity in sustainability certification, is no real help in building trust. Question marks also arise regarding the efficiency of traditional security means applied in the cloud domain. Besides the economic and technological advantages offered by the cloud, there are also advantages in the security area if information is migrated to the cloud. Shared resources available in the cloud include monitoring and the use of "best practices" and technology for an advanced security level, above all in the solutions offered to medium and small businesses, big companies and even some governmental organizations [2].

  9. A review of Technologies on Tracking Tibetan Public Opinion Topics

    Directory of Open Access Journals (Sweden)

    Chen Bi-Rong

    2016-01-01

    Full Text Available The target of technologies for tracking Tibetan-based network public opinion topics is to track the development of known topics. Tracking these topics can help people grasp the trend of public opinion. This research area is becoming more and more important in the Natural Language Processing domain, and how to apply state-of-the-art computer science technologies to the study of a national language has drawn much attention from researchers. This article introduces some features of the Tibetan language and its grammar, and also discusses technologies for categorizing Tibetan text and tracking Tibetan-based network public opinion.

  10. DIMA 3.0: Domain Interaction Map.

    Science.gov (United States)

    Luo, Qibin; Pagel, Philipp; Vilne, Baiba; Frishman, Dmitrij

    2011-01-01

    Domain Interaction MAp (DIMA, available at http://webclu.bio.wzw.tum.de/dima) is a database of predicted and known interactions between protein domains. It integrates 5807 structurally known interactions imported from the iPfam and 3did databases and 46,900 domain interactions predicted by four computational methods: domain phylogenetic profiling, the domain pair exclusion algorithm, correlated mutations, and domain interaction prediction in a discriminative way. Additionally, predictions are filtered to exclude those domain pairs that are reported as non-interacting by the Negatome database. The DIMA Web site allows users to calculate domain interaction networks either for a domain of interest or for entire organisms, and to explore them interactively using the Flash-based Cytoscape Web software.
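Of the four prediction methods DIMA integrates, domain phylogenetic profiling is the simplest to illustrate: domains whose presence/absence patterns across a set of genomes are similar are treated as candidate interactors. A minimal sketch using Jaccard similarity over invented binary profiles (the domain names and genome patterns below are made up for illustration):

```python
def profile_similarity(p, q):
    """Jaccard similarity of two binary phylogenetic profiles:
    |genomes with both domains| / |genomes with either domain|."""
    both = sum(a and b for a, b in zip(p, q))
    either = sum(a or b for a, b in zip(p, q))
    return both / either if either else 0.0

# Each entry marks presence (1) / absence (0) of a domain in one of
# six hypothetical genomes. Highly similar profiles suggest the two
# domains co-evolved and may interact.
kinase = [1, 1, 0, 1, 1, 0]
sh2    = [1, 1, 0, 1, 0, 0]
print(profile_similarity(kinase, sh2))  # 3 shared / 4 occupied genomes
```

A real profiling pipeline would compute this score over hundreds of genomes for every domain pair and keep pairs above a significance threshold; the other three DIMA methods use entirely different signals.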

  11. The block cipher NSABC (public domain)

    CERN Document Server

    Nguyenova-Stepanikova, Alice

    2011-01-01

    We introduce NSABC/w -- Nice-Structured Algebraic Block Cipher using w-bit word arithmetic, a 4w-bit analogue of Skipjack [NSA98] with a 5w-bit key. Skipjack's internal 4-round Feistel structure is replaced with a w-bit, 2-round cascade of a binary operation (x, z) ↦ (x ⊡ z) ⋘ (w/2) that permutes a text word x under control of a key word z. The operation ⊡, similarly to the multiplication in IDEA [LM91, LMM91], is based on an algebraic group over w-bit words, so it is also capable of decrypting by means of the inverse element of z in the group. The cipher utilizes a secret 4w-bit tweak -- an easily changeable parameter with a unique value for each block encrypted under the same key [LRW02] -- that is derived from the block index and an additional 4w-bit key. A software implementation for w=64 takes circa 9 clock cycles per byte on x86-64 processors.
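The round operation described — combining a text word with a key word through a group operation, then rotating by half the word width — can be sketched for w = 16. Since the abstract only says the operation is based on an algebraic group "similarly to the multiplication in IDEA", the sketch below substitutes the IDEA multiplication modulo 2¹⁶ + 1; that choice is an assumption for illustration, not NSABC's actual definition:

```python
M = (1 << 16) + 1  # 65537 is prime, so nonzero residues form a group

def mul(x, z):
    """IDEA-style multiplication on 16-bit words; 0 encodes 2^16."""
    r = ((x or 0x10000) * (z or 0x10000)) % M
    return 0 if r == 0x10000 else r

def inv(z):
    """Group inverse of z, enabling decryption of the round."""
    r = pow(z or 0x10000, M - 2, M)  # Fermat inverse mod the prime
    return 0 if r == 0x10000 else r

def rotl(x, r):
    """Rotate a 16-bit word left by r bits."""
    return ((x << r) | (x >> (16 - r))) & 0xFFFF

def round_enc(x, z):
    # One stage of the cascade: (x combined-with z) rotated by w/2.
    return rotl(mul(x, z), 8)

def round_dec(y, z):
    # Undo the rotation (8 + 8 = 16 bits = identity), then apply inv(z).
    return mul(rotl(y, 8), inv(z))

x, z = 0x1234, 0xBEEF
assert round_dec(round_enc(x, z), z) == x  # round is invertible
```

The invertibility shown at the end is exactly the property the abstract attributes to the group structure: decryption uses the inverse element of the key word rather than a separate algorithm.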

  12. 22 CFR 120.11 - Public domain.

    Science.gov (United States)

    2010-04-01

    .... Government and specific access and dissemination controls protecting information resulting from the research... published information; (3) Through second class mailing privileges granted by the U.S. Government; (4) At.... Government access and dissemination controls. University research will not be considered fundamental...

  13. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  14. Public health situation awareness: toward a semantic approach

    Science.gov (United States)

    Mirhaji, Parsa; Richesson, Rachel L.; Turley, James P.; Zhang, Jiajie; Smith, Jack W.

    2004-04-01

    We propose a knowledge-based public health situation awareness system. The basis for this system is an explicit representation of public health situation awareness concepts and their interrelationships. This representation is based upon the users' (public health decision makers) cognitive model of the world, and optimized towards the efficacy of performance and relevance to the public health situation awareness processes and tasks. In our approach, explicit domain knowledge is the foundation for interpretation of public health data, as opposed to conventional systems where statistical methods are the essence of the processes. Objectives: To develop a prototype knowledge-based system for public health situation awareness and to demonstrate the utility of knowledge-intensive approaches in integration of heterogeneous information, eliminating the effects of incomplete and poor quality surveillance data, uncertainty in syndrome and aberration detection, and visualization of complex information structures in public health surveillance settings, particularly in the context of bioterrorism (BT) preparedness. The system employs the Resource Description Framework (RDF) and additional layers of more expressive languages to explicate the knowledge of domain experts into machine-interpretable and computable problem-solving modules that can then guide users and computer systems in sifting through the most "relevant" data for syndrome and outbreak detection and investigation of the root cause of the event. The Center for Biosecurity and Public Health Informatics Research is developing a prototype knowledge-based system around influenza, which has complex natural disease patterns, many public health implications, and is a potential agent for bioterrorism.
The preliminary data from this effort may demonstrate superior performance in information integration, syndrome and aberration detection, information access through information visualization, and cross-domain investigation of the

  15. Integrated computational approach to the analysis of NMR relaxation in proteins: application to ps-ns main chain 15N-1H and global dynamics of the Rho GTPase binding domain of plexin-B1.

    Science.gov (United States)

    Zerbetto, Mirco; Buck, Matthias; Meirovitch, Eva; Polimeno, Antonino

    2011-01-20

    An integrated computational methodology for interpreting NMR spin relaxation in proteins has been developed. It combines a two-body coupled-rotator stochastic model with a hydrodynamics-based approach for protein diffusion, together with molecular dynamics based calculations for the evaluation of the coupling potential of mean force. The method is applied to ¹⁵N relaxation of N-H bonds in the Rho GTPase binding (RBD) domain of plexin-B1, which exhibits intricate internal mobility. Bond vector dynamics are characterized by a rhombic local ordering tensor, S, with principal values S₀² and S₂², and an axial local diffusion tensor, D₂, with principal values D(2,||) and D(2,⊥). For α-helices and β-sheets we find that S₀² ~ -0.5 (strong local ordering), -1.2 computational approach for treating NMR relaxation in proteins by combining stochastic modeling and molecular dynamics. The approach developed provides new insights by its application to a protein that experiences complex dynamics.

  16. The FELICIA bulletin board system and the IRBIS anonymous FTP server: Computer security information sources for the DOE community. CIAC-2302

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1993-11-03

    The Computer Incident Advisory Capability (CIAC) operates two information servers for the DOE community, FELICIA (formerly FELIX) and IRBIS. FELICIA is a computer Bulletin Board System (BBS) that can be accessed by telephone with a modem. IRBIS is an anonymous ftp server that can be accessed on the Internet. Both of these servers contain all of the publicly available CIAC, CERT, NIST, and DDN bulletins, virus descriptions, the VIRUS-L moderated virus bulletin board, copies of public domain and shareware virus-detection/protection software, and copies of useful public domain and shareware utility programs. This guide describes how to connect to these systems and obtain files from them.

  17. A protein domain interaction interface database: InterPare

    Directory of Open Access Journals (Sweden)

    Lee Jungsul

    2005-08-01

    Full Text Available Abstract Background Most proteins function by interacting with other molecules. Their interaction interfaces are highly conserved throughout evolution to avoid undesirable interactions that lead to fatal disorders in cells. Rational drug discovery includes computational methods to identify the interaction sites of lead compounds to the target molecules. Identifying and classifying protein interaction interfaces on a large scale can help researchers discover drug targets more efficiently. Description We introduce a large-scale protein domain interaction interface database called InterPare http://interpare.net. It contains both inter-chain (between chains) interfaces and intra-chain (within chain) interfaces. InterPare uses three methods to detect interfaces: (1) the geometric distance method, for checking the distance between atoms that belong to different domains; (2) Accessible Surface Area (ASA), a method for detecting the buried region of a protein that is detached from a solvent when forming multimers or complexes; and (3) the Voronoi diagram, a computational geometry method that uses a mathematical definition of interface regions. InterPare includes visualization tools to display protein interior, surface, and interaction interfaces. It also provides statistics such as the amino acid propensities of a queried protein according to its interior, surface, and interface regions. The atom coordinates that belong to interface, surface, and interior regions can be downloaded from the website. Conclusion InterPare is an open and public database server for protein interaction interface information. It contains the large-scale interface data for proteins whose 3D-structures are known. As of November 2004, there were 10,583 (geometric distance), 10,431 (ASA), and 11,010 (Voronoi diagram) entries in the Protein Data Bank (PDB) containing interfaces, according to the above three methods.
In the case of the geometric distance method, there are 31,620 inter-chain domain-domain
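The first of the three detection methods — flagging atom pairs from different domains that fall within a geometric distance cutoff — reduces to a pairwise-distance computation. A sketch with NumPy; the 5 Å cutoff is an assumed typical contact threshold, not necessarily the value InterPare uses:

```python
import numpy as np

CUTOFF = 5.0  # Angstroms; assumed illustrative cutoff

def interface_atoms(dom_a, dom_b, cutoff=CUTOFF):
    """Index pairs (i, j) of atoms from two domains closer than `cutoff`."""
    # Broadcast to an (n_a, n_b) matrix of Euclidean distances.
    d = np.linalg.norm(dom_a[:, None, :] - dom_b[None, :, :], axis=-1)
    return np.argwhere(d < cutoff)

# Toy coordinates: two atoms per domain.
a = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
b = np.array([[3.0, 0.0, 0.0], [20.0, 0.0, 0.0]])
pairs = interface_atoms(a, b)
print(pairs)  # only atom 0 of each domain lies within the cutoff
```

On real PDB structures the same computation runs over all atoms of the two domains, and the residues owning the flagged atoms constitute the interface; the ASA and Voronoi methods reach the interface by different geometric routes.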

  18. The role of interdisciplinary research team in the impact of health apps in health and computer science publications: a systematic review.

    Science.gov (United States)

    Molina Recio, Guillermo; García-Hernández, Laura; Molina Luque, Rafael; Salas-Morera, Lorenzo

    2016-07-15

    Several studies have estimated the potential economic and social impact of mHealth development. According to the latest study by the Institute for Healthcare Informatics, more than 165,000 health and medicine apps are offered across the stores of the different platforms. The global mHealth market was an estimated $10.5 billion in 2014 and is expected to grow 33.5 percent annually between 2015 and 2020. In fact, health apps have become the third-fastest growing category, after only games and utilities. This study aims to identify, study and evaluate the role of interdisciplinary research teams in the development of articles and applications in the field of mHealth. It also aims to evaluate the impact that the development of mHealth has had on the health and computer science fields, through the study of publications in specific databases for each area published up to now. Interdisciplinary character is strongly connected to the scientific quality of the journal in which the work is published: there are significant differences for works produced by an interdisciplinary research team, because they achieve publication in journals in higher quartiles. There are already studies that warn of methodological deficits in some mHealth studies, with low accuracy and no reproducibility. Studies of low precision and poor reproducibility, coupled with low evidence, yield low degrees of recommendation for the targeted interventions and therefore low applicability. From the evidence of this study, working in interdisciplinary groups from different areas greatly enhances the quality of the research work as well as the quality of the publications derived from its results.

  19. Research on IaaS Public Cloud Computing Platform Scheduling Model

    Institute of Scientific and Technical Information of China (English)

    岳冬利; 刘海涛; 孙傲冰

    2011-01-01

    A service model for IaaS public clouds is created, and based on queueing theory, the service mode, queue length, and configuration of the scheduling server are optimized. A scheduling model based on demand vectors is then proposed to filter available host machines from the set of physical machines managed by the platform, according to the match between the demand and the available resources. If no host machine meeting the demand can be found at first, the scheduling algorithm combines with virtual machine migration to reallocate physical resources, maximizing platform resource utilization while guaranteeing the availability of the whole platform. The feasibility of the algorithm is verified on our own IaaS public cloud computing platform.
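The demand-vector matching step — filtering the managed host machines down to those whose free resources cover every component of the request — can be sketched as follows. Host names, capacities, and the two-component (CPU, RAM) vectors are invented for illustration:

```python
def feasible_hosts(demand, hosts):
    """Hosts whose free capacity covers every component of the demand vector."""
    return [name for name, free in hosts.items()
            if all(f >= d for f, d in zip(free, demand))]

# Hypothetical free capacity per host: (CPU cores, RAM in GB).
hosts = {"h1": (8, 32), "h2": (2, 8), "h3": (16, 64)}

demand = (4, 16)  # a VM request: 4 cores, 16 GB
print(feasible_hosts(demand, hosts))  # h2 is too small; h1 and h3 qualify
```

When this filter returns an empty list, the described algorithm would fall back to virtual machine migration to free up capacity on some host before retrying the placement.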

  20. .Gov Domains API

    Data.gov (United States)

    General Services Administration — This dataset offers the list of all .gov domains, including state, local, and tribal .gov domains. It does not include .mil domains, or other federal domains outside...

  1. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives: granular computing as structured thinking and as structured problem solving. From the philosophical perspective, or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy at the application level deals with structured problem solving.

  2. Mathematics, Computers in Mathematics, and Gender: Public Perceptions in Context (Matemáticas, Ordenadores en Matemáticas y Género: Percepciones Públicas en Contexto)

    Directory of Open Access Journals (Sweden)

    Helen J. Forgasz

    2011-09-01

    Full Text Available In Australia, national tests of mathematics achievement continue to show small but consistent gender differences in favor of boys. Societal views and pressures are among the factors invoked to explain such subtle but persistent differences. In this paper we focus directly on the beliefs of the general public about students' learning of mathematics and the role played by computers, and then compare the findings with data previously gathered from students. Although many considered it inappropriate to differentiate between boys and girls, gender-based stereotyping was still evident.

  3. Nonlinear Evolution of Ferroelectric Domains

    Institute of Scientific and Technical Information of China (English)

    Lu, Wei; Fang, Dai-Ning; et al.

    1997-01-01

    The nonlinear evolution of ferroelectric domains is investigated in this paper, and a model is proposed that can be applied to numerical computation. Numerical results show that the model accurately predicts some nonlinear behavior and is consistent with experimental results.

  4. Critique and Re-building: Reflections on the Public-Private Duality of Gender Norms --- From the Perspective of Nancy Fraser's Concept of the Public Sphere

    Institute of Scientific and Technical Information of China (English)

    丁慧

    2012-01-01

    The dual division of public and private spheres, as Habermas's ideal picture, has played an important role in theoretical construction, exerted a wide influence in academia, and provoked lasting debate, being questioned and challenged from many sides. Among these critiques, feminism has carried out a subversive transformation of the concept of the public sphere. Feminist attention to this problem focuses mainly on women's relation to society and women's social status; reflecting on the single, overarching conception of the public sphere from the perspective of gender analysis, the role of women as a vulnerable group has long been ignored. The public sphere is both the place where discursive opinions are formed and the place where social identities are formed. In the course of historical development, women have remained in a state of speechlessness, and women's topics have been obscured outside power relations; even in the liberal period, the value of domestic labor was never recognized or developed. The problem of gender bias has not been fundamentally changed, and the gender norms and gender institutions formed on the basis of the will of the public sphere inevitably exhibit patriarchal characteristics. Therefore, a theoretical revision of the public-private duality is an important theoretical fulcrum for the value and principle of gender equality.

  5. Dynamics of domain coverage of the protein sequence universe

    Science.gov (United States)

    2012-01-01

Background The currently known protein sequence space consists of millions of sequences in public databases and is rapidly expanding. Assigning sequences to families leads to a better understanding of protein function and the nature of the protein universe. However, a large portion of the current protein space remains unassigned and is referred to as its “dark matter”. Results Here we suggest that the true size of “dark matter” is much larger than stated by current definitions. We propose an approach to reducing the size of “dark matter” by identifying and subtracting regions in protein sequences that are not likely to contain any domain. Conclusions Recent improvements in computational domain modeling result in a decrease, albeit a slow one, in the relative size of “dark matter”; however, its absolute size increases substantially with the growth of sequence data. PMID:23157439

  6. Psychological Analysis of Public Library Readers in the Environment of Computer Network Resources%计算机网络资源环境下公共图书馆读者心理分析

    Institute of Scientific and Technical Information of China (English)

    梁佳

    2012-01-01

As nonprofit cultural and educational institutions open to the general public, public libraries serve every member of society in a position to use their resources. The nature of the public library determines that its readership has broadly social and mass characteristics. Understanding the psychology of public library readers in the environment of computer network resources, and grasping the reading tendencies of readers of different ages, provides a basis for public library reader services. Readers differ in composition, in reading motivation and purpose, and in reading psychology; studying reader psychology and providing corresponding collection and network information resource services is of great significance for meeting the needs of readers of different age groups and improving the utilization of collections and network resources. This paper analyzes the psychology of public library readers in the environment of computer network resources and further explores how public libraries can better carry out reader services in this environment.

  7. Publicity and public relations

    Science.gov (United States)

    Fosha, Charles E.

    1990-01-01

    This paper addresses approaches to using publicity and public relations to meet the goals of the NASA Space Grant College. Methods universities and colleges can use to publicize space activities are presented.

  8. Concept Convergence in Empirical Domains

    Science.gov (United States)

    Ontañón, Santiago; Plaza, Enric

    How to achieve shared meaning is a significant issue when more than one intelligent agent is involved in the same domain. We define the task of concept convergence, by which intelligent agents can achieve a shared, agreed-upon meaning of a concept (restricted to empirical domains). For this purpose we present a framework that, integrating computational argumentation and inductive concept learning, allows a pair of agents to (1) learn a concept in an empirical domain, (2) argue about the concept's meaning, and (3) reach a shared agreed-upon concept definition. We apply this framework to marine sponges, a biological domain where the actual definitions of concepts such as orders, families and species are currently open to discussion. An experimental evaluation on marine sponges shows that concept convergence is achieved, within a reasonable number of interchanged arguments, and reaching short and accurate definitions (with respect to precision and recall).

  9. An Evaluation of the Pedestrian Classification in a Multi-Domain Multi-Modality Setup

    Directory of Open Access Journals (Sweden)

    Alina Miron

    2015-06-01

Full Text Available The objective of this article is to study the problem of pedestrian classification across different light spectrum domains, visible and far-infrared (FIR), and across modalities (intensity, depth and motion). In recent years, there have been a number of approaches for classifying and detecting pedestrians in both FIR and visible images, but the methods are difficult to compare because either the datasets are not publicly available or they do not offer a comparison between the two domains. Our two primary contributions are the following: (1) we propose a public dataset, named RIFIR, containing both FIR and visible images collected in an urban environment from a moving vehicle during daytime; and (2) we compare the state-of-the-art features in a multi-modality setup: intensity, depth and flow, in far-infrared over visible domains. The experiments show that the feature families, intensity self-similarity (ISS), local binary patterns (LBP), local gradient patterns (LGP) and histogram of oriented gradients (HOG), computed from the FIR and visible domains are highly complementary, but their relative performance varies across the different modalities. In our experiments, the FIR domain has proven superior to the visible one for the task of pedestrian classification, but the overall best results are obtained by a multi-domain multi-modality multi-feature fusion.

  10. Design of Logistics Public Information Platform Based on Cloud Computing Architecture%云计算架构下的物流公共信息平台设计探讨

    Institute of Scientific and Technical Information of China (English)

    李姝宁

    2012-01-01

In order to further study the specific characteristics of a logistics public information platform under cloud computing, this paper takes the overall cloud computing architecture as its starting point, constructs the cloud architecture of a logistics public information platform, describes its working principle, and finally, on the basis of this analysis, looks at the future development trend of logistics public information platforms.

  11. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  12. Protein domain boundary prediction by combining support vector machine and domain guess by size algorithm

    Institute of Scientific and Technical Information of China (English)

    Dong Qiwen; Wang Xiaolong; Lin Lei

    2007-01-01

Successful prediction of protein domain boundaries provides valuable information not only for the computational structure prediction of multi-domain proteins but also for experimental structure determination. A novel method for domain boundary prediction is presented, which combines a support vector machine with the domain-guess-by-size algorithm. Since the evolutionary information of multiple domains can be detected in a position-specific score matrix, the support vector machine is trained and tested using the values of position-specific score matrices generated by PSI-BLAST. Candidate domain boundaries are selected from the output of the support vector machine and then input to the domain-guess-by-size algorithm to give the final boundary predictions. The experimental results show that the combined method outperforms each individual method, both the support vector machine alone and domain guess by size alone.
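
The final, size-based selection step can be illustrated with a small sketch (a toy stand-in, not the actual domain-guess-by-size algorithm; the candidate boundaries, sequence length, and mean domain length below are all hypothetical):

```python
def pick_boundary(candidates, seq_len, mean_len=120.0):
    """Toy stand-in for size-based selection: among candidate boundaries
    (e.g. proposed by an SVM), favor the one whose two resulting segments
    deviate least from an assumed typical domain length."""
    def cost(boundary):
        # Total deviation of both segments from the assumed mean domain size
        return abs(boundary - mean_len) + abs((seq_len - boundary) - mean_len)
    return min(candidates, key=cost)

# For a hypothetical 240-residue two-domain protein, the near-middle
# candidate yields two segments closest to the assumed 120-residue mean.
best = pick_boundary([60, 115, 200], seq_len=240)
```

A real implementation would score candidates against an empirical domain-length distribution rather than a single mean, but the filtering role is the same.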

13. Computer Teaching Reform in Public Security Colleges Based on the "Competency-Based" Approach%基于“能力本位”的公安院校计算机教学改革初探

    Institute of Scientific and Technical Information of China (English)

    赵薇; 刘振华

    2013-01-01

In recent years, as the informatization of society has accelerated, computers have come to play an important role in ever more fields, and public security departments are no exception. With the rising informatization level of public security departments come new requirements for the computer skills of public security personnel: only by mastering computer technology at a high level can they be competent for police work under informatized conditions. This likewise places new demands on computer courses in police colleges. Based on the characteristics of modern policing and the needs of the police profession, this paper introduces the "competency-based" teaching philosophy into computer teaching, providing a new approach, and opening a new situation, for training the applied computing talent that modern policing urgently needs.

  14. Integrating natural language processing and web GIS for interactive knowledge domain visualization

    Science.gov (United States)

    Du, Fangming

    Recent years have seen a powerful shift towards data-rich environments throughout society. This has extended to a change in how the artifacts and products of scientific knowledge production can be analyzed and understood. Bottom-up approaches are on the rise that combine access to huge amounts of academic publications with advanced computer graphics and data processing tools, including natural language processing. Knowledge domain visualization is one of those multi-technology approaches, with its aim of turning domain-specific human knowledge into highly visual representations in order to better understand the structure and evolution of domain knowledge. For example, network visualizations built from co-author relations contained in academic publications can provide insight on how scholars collaborate with each other in one or multiple domains, and visualizations built from the text content of articles can help us understand the topical structure of knowledge domains. These knowledge domain visualizations need to support interactive viewing and exploration by users. Such spatialization efforts are increasingly looking to geography and GIS as a source of metaphors and practical technology solutions, even when non-georeferenced information is managed, analyzed, and visualized. When it comes to deploying spatialized representations online, web mapping and web GIS can provide practical technology solutions for interactive viewing of knowledge domain visualizations, from panning and zooming to the overlay of additional information. This thesis presents a novel combination of advanced natural language processing - in the form of topic modeling - with dimensionality reduction through self-organizing maps and the deployment of web mapping/GIS technology towards intuitive, GIS-like, exploration of a knowledge domain visualization. A complete workflow is proposed and implemented that processes any corpus of input text documents into a map form and leverages a web

  15. Prediction of Solution Properties of Flexible-Chain Polymers: A Computer Simulation Undergraduate Experiment

    Science.gov (United States)

    de la Torre, Jose Garcia; Cifre, Jose G. Hernandez; Martinez, M. Carmen Lopez

    2008-01-01

    This paper describes a computational exercise at undergraduate level that demonstrates the employment of Monte Carlo simulation to study the conformational statistics of flexible polymer chains, and to predict solution properties. Three simple chain models, including excluded volume interactions, have been implemented in a public-domain computer…
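
A minimal Monte Carlo sketch in the same spirit (a freely jointed lattice chain, not necessarily one of the three chain models used in the exercise; all names and parameters are illustrative) estimates the mean squared end-to-end distance, which for an ideal chain of N unit bonds approaches N:

```python
import random

def end_to_end_sq(n_bonds, rng):
    """Grow one freely jointed chain on a cubic lattice (no excluded volume)
    and return its squared end-to-end distance."""
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    x = y = z = 0
    for _ in range(n_bonds):
        dx, dy, dz = rng.choice(steps)
        x, y, z = x + dx, y + dy, z + dz
    return x * x + y * y + z * z

def mean_r2(n_bonds=50, samples=5000, seed=0):
    """Monte Carlo average of R^2 over many independently grown chains."""
    rng = random.Random(seed)
    return sum(end_to_end_sq(n_bonds, rng) for _ in range(samples)) / samples
```

For the ideal (random-walk) chain the average is close to `n_bonds`; adding excluded-volume rejection would swell the chain, which is the kind of effect the described exercise lets students explore.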

  16. Coping with Computer Viruses: General Discussion and Review of Symantec Anti-Virus for the Macintosh.

    Science.gov (United States)

    Primich, Tracy

    1992-01-01

    Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM is recommended for use in library settings as well as two public domain virus protection programs. (four references) (MES)

  17. Ubiquitin domain proteins in disease

    DEFF Research Database (Denmark)

    Klausen, Louise Kjær; Schulze, Andrea; Seeger, Michael

    2007-01-01

The human genome encodes several ubiquitin-like (UBL) domain proteins (UDPs). Members of this protein family are involved in a variety of cellular functions and many are connected to the ubiquitin proteasome system, an essential pathway for protein degradation in eukaryotic cells. Despite their s… and cancer. Publication history: Republished from Current BioData's Targeted Proteins database (TPdb; http://www.targetedproteinsdb.com).

  18. Domain similarity based orthology detection.

    Science.gov (United States)

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich

    2015-05-13

Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows quadratically. A current challenge of bioinformatics research, especially given the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationally feasible in a reasonable amount of time. We propose to speed up the detection of orthologous proteins by using strings of domains to characterize the proteins. We present two new protein similarity measures, a cosine score and a maximal weight matching score based on domain content similarity, and new software, named porthoDom. The qualities of the cosine and the maximal weight matching similarity measures are compared against curated datasets. The measures show that domain content similarities are able to correctly group proteins into their families. Accordingly, the cosine similarity measure is used inside porthoDom, the wrapper developed for proteinortho. porthoDom makes use of domain content similarity measures to group proteins together before searching for orthologs. By using domains instead of amino acid sequences, the reduction of the search space decreases the computational complexity of an all-against-all sequence comparison. We demonstrate that representing and comparing proteins as strings of discrete domains, i.e. as a concatenation of their unique identifiers, allows a drastic simplification of the search space. porthoDom has the advantage of speeding up orthology detection while maintaining a degree of accuracy similar to proteinortho. The implementation of porthoDom is released using the python and C++ languages and is available under the GNU GPL licence 3 at http://www.bornberglab.org/pages/porthoda.
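
The cosine score over domain content can be sketched as follows (an illustrative reimplementation of the idea, not porthoDom's actual code; the Pfam-style identifiers in the example are made up):

```python
from collections import Counter
from math import sqrt

def domain_cosine(domains_a, domains_b):
    """Cosine similarity between two proteins represented as bags of
    domain identifiers (order-insensitive; repeated domains count)."""
    ca, cb = Counter(domains_a), Counter(domains_b)
    dot = sum(ca[d] * cb[d] for d in ca.keys() & cb.keys())
    norm_a = sqrt(sum(v * v for v in ca.values()))
    norm_b = sqrt(sum(v * v for v in cb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```

Two proteins with identical domain content score 1.0 and proteins sharing no domains score 0.0, so cheap pre-grouping by such a score can prune the quadratic all-against-all sequence comparison.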

  19. The Bergman Kernels on Generalized Exceptional Hua Domains

    Institute of Scientific and Technical Information of China (English)

    殷慰萍; 赵振刚

    2001-01-01

Yin Weiping introduced four types of Hua domain, which are built on the four types of Cartan domain; the Bergman kernels on these four types of Hua domain can be computed in explicit formulas [1]. In this paper, two types of domains defined by (10) and (11) (see below) are introduced, which are built on the two exceptional Cartan domains. We compute the Bergman kernels explicitly for these two domains, and we also study the asymptotic behavior of the Bergman kernel function near boundary points, drawing on Appell's multivariable hypergeometric function.

  20. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

A collaboration between Caltech's postdoctoral associate N. Albin and O. P. Bruno. Related publications: "…KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics.

  1. 大规模并行时域有限差分法电磁计算研究%Study on Large-Scale Parallel Finite-Difference Time-Domain Method for Electromagnetic Computation

    Institute of Scientific and Technical Information of China (English)

    江树刚; 张玉; 赵勋旺

    2015-01-01

The performance and applications of the large-scale parallel Finite-Difference Time-Domain (FDTD) method are studied on China-made supercomputer platforms. Parallel efficiency is tested on the Magic Cube supercomputer, ranked the 10th fastest supercomputer in the world in Nov. 2008; the Sunway BlueLight MPP supercomputer, the first large-scale parallel supercomputer built with China's own CPUs; and the Tianhe-2 supercomputer, currently ranked 1st in the world. The algorithm is run on the three machines with up to 10,000, 100,000 and 300,000 CPU cores, respectively. In all test configurations the parallel efficiency reaches above 50%, indicating good scalability of the parallel algorithm. Simulations of the radiation characteristics of several microstrip antenna arrays and the scattering characteristics of a large airplane demonstrate that the parallel FDTD method can perform accurate and efficient simulation of complicated electromagnetic problems on supercomputers with different architectures.
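
At the heart of any FDTD code, serial or massively parallel, is the leapfrog Yee update; the following 1D sketch in normalized units (illustrative only: grid size, step count, and source shape are arbitrary choices, and the paper's implementation instead decomposes a 3D grid across many CPU cores) shows the update structure each process applies to its subdomain:

```python
import numpy as np

def fdtd_1d(nz=200, steps=80, src=100):
    """Minimal 1D FDTD (Yee leapfrog) in normalized units with a
    Courant number of 1 and a soft Gaussian-pulse source."""
    Ez = np.zeros(nz)
    Hy = np.zeros(nz)
    for t in range(steps):
        # Update H from the spatial difference (curl) of E
        Hy[:-1] += Ez[1:] - Ez[:-1]
        # Update E from the spatial difference (curl) of H
        Ez[1:] += Hy[1:] - Hy[:-1]
        # Inject a smooth pulse at the source cell
        Ez[src] += np.exp(-((t - 30.0) / 10.0) ** 2)
    return Ez
```

In a domain-decomposed run, the only communication each step is the exchange of one layer of boundary field values between neighboring subdomains, which is what lets FDTD scale to hundreds of thousands of cores.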

  2. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  3. A typology of public engagement mechanisms

    NARCIS (Netherlands)

    Rowe, G.; Frewer, L.J.

    2005-01-01

Imprecise definitions of key terms in the "public participation" domain have hindered the conduct of good research and militated against the development and implementation of effective participation practices. In this article, we define key concepts in the domain: public communication, public consult

  4. Public Education, Public Good.

    Science.gov (United States)

    Tomlinson, John

    1986-01-01

    Criticizes policies which would damage or destroy a public education system. Examines the relationship between government-provided education and democracy. Concludes that privatization of public education would emphasize self-interest and selfishness, further jeopardizing the altruism and civic mindedness necessary for the public good. (JDH)

  5. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

Full Text Available In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having a larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on the CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on the CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of output image quality on both processing architectures.
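
The pipeline described above (forward FFT, pointwise multiplication by a transfer function, inverse FFT) can be sketched on the CPU with NumPy; this is an illustrative Gaussian low-pass only, with an assumed `sigma` cutoff parameter, and a CUDA version would apply the same three steps through cuFFT and GPU kernels:

```python
import numpy as np

def gaussian_lowpass(image, sigma):
    """Frequency-domain Gaussian low-pass: FFT, multiply by H(u, v), inverse FFT."""
    rows, cols = image.shape
    # Normalized frequency coordinates along each axis
    u = np.fft.fftfreq(rows).reshape(-1, 1)
    v = np.fft.fftfreq(cols).reshape(1, -1)
    # Gaussian transfer function: H = 1 at DC, falling off with frequency
    H = np.exp(-(u ** 2 + v ** 2) / (2.0 * sigma ** 2))
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * H))
```

Because H equals 1 at DC, a constant image passes through unchanged while high-frequency content is attenuated; the FFT route replaces an O(N^2 k^2) spatial convolution with O(N^2 log N) work for a k-by-k kernel on an N-by-N image.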

  6. Interaction Design for Public Spaces

    DEFF Research Database (Denmark)

    Kortbek, Karen Johanne

    2008-01-01

In this abstract I describe the doctoral research project "Interaction Design for Public Spaces". The objective of the project is to explore and design interaction contexts in culture-related public spaces such as museums, experience centres and festivals. As a perspective on this domain, I… will help interaction designers when designing for bodily movement, and communicating and staging interactive content in public spaces.

  7. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  8. 基于区域分解的共轭梯度法的有限元结构分析并行计算%Parallel computing for finite element structural analysis using conjugate gradient method based on domain decomposition

    Institute of Scientific and Technical Information of China (English)

    付朝江; 张武

    2006-01-01

The parallel finite element method using a domain decomposition technique is adapted to a distributed parallel environment on a workstation cluster. An algorithm is presented for parallelizing the preconditioned conjugate gradient method based on domain decomposition. Using the developed code, a dam structural analysis problem is solved on the workstation cluster and results are given. The parallel performance is analyzed.
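
The serial kernel being parallelized can be sketched as follows (a generic preconditioned conjugate gradient with a Jacobi preconditioner standing in for the domain-decomposition preconditioner; in the cluster version each workstation would form the matrix-vector product for its own subdomain and exchange interface values):

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive definite A.
    M_inv is a callable applying the preconditioner to a residual vector."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    z = M_inv(r)           # preconditioned residual
    p = z.copy()           # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p         # the mat-vec each subdomain would compute locally
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

Only the matrix-vector product and two inner products need global communication per iteration, which is why this method maps well onto distributed-memory clusters.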

9. Research on a Cloud Computing Architecture for Public Security Platforms Based on the Internet of Things%基于物联网的公共安全云计算平台

    Institute of Scientific and Technical Information of China (English)

    白蛟; 全春来; 郭镇

    2011-01-01

Internet of Things technology is introduced into the field of public safety, and the applications of distributed computing, virtualized storage and cloud computing are discussed. To overcome the disadvantages of the existing public security platform, a five-layer Internet-of-Things public safety platform architecture is designed, and the features and applied technologies of each layer are described, providing a new approach for the future construction of a police Internet of Things. To achieve business data sharing and security, a cloud-computing-based data support platform is proposed on top of this architecture; it provides virtualized data storage and management while offering high-performance computing power and dynamic expansion of storage equipment, improving the security and computing capability of the Internet of Things application.

  10. Analyses of domains and domain fusions in human proto-oncogenes

    Directory of Open Access Journals (Sweden)

    Wan Ping

    2009-03-01

    Full Text Available Abstract Background Understanding the constituent domains of oncogenes, their origins and their fusions may shed new light about the initiation and the development of cancers. Results We have developed a computational pipeline for identification of functional domains of human genes, prediction of the origins of these domains and their major fusion events during evolution through integration of existing and new tools of our own. An application of the pipeline to 124 well-characterized human oncogenes has led to the identification of a collection of domains and domain pairs that occur substantially more frequently in oncogenes than in human genes on average. Most of these enriched domains and domain pairs are related to tyrosine kinase activities. In addition, our analyses indicate that a substantial portion of the domain-fusion events of oncogenes took place in metazoans during evolution. Conclusion We expect that the computational pipeline for domain identification, domain origin and domain fusion prediction will prove to be useful for studying other groups of genes.

  11. Protein domain prediction

    NARCIS (Netherlands)

    Ingolfsson, Helgi; Yona, Golan

    2008-01-01

    Domains are considered to be the building blocks of protein structures. A protein can contain a single domain or multiple domains, each one typically associated with a specific function. The combination of domains determines the function of the protein, its subcellular localization and the interacti

  12. Membrane binding domains

    OpenAIRE

    Hurley, James H.

    2006-01-01

    Eukaryotic signaling and trafficking proteins are rich in modular domains that bind cell membranes. These binding events are tightly regulated in space and time. The structural, biochemical, and biophysical mechanisms for targeting have been worked out for many families of membrane binding domains. This review takes a comparative view of seven major classes of membrane binding domains, the C1, C2, PH, FYVE, PX, ENTH, and BAR domains. These domains use a combination of specific headgroup inter...

13. 微课在高校计算机公共课Blending Learning中的应用研究%Research on the Application of Micro-Lectures in Blended Learning for University Public Computer Courses

    Institute of Scientific and Technical Information of China (English)

    陈静; 胡玉娟

    2016-01-01

This paper studies the application methods and effects of micro-lectures in blended learning for university public computer courses. First, we analyze the characteristics of blended learning and of micro-lectures in university public computer courses, and then summarize the design and application of micro-lectures in this setting. A "micro-lecture learning task list" was designed based on cognitive load theory, a questionnaire was designed based on Keller's ARCS motivation model, and, using an action-research approach, three types of micro-lectures were put into practice in teaching classes. The investigation analyzes the learning effects of micro-lectures in blended learning for university public computer courses and summarizes students' preferences regarding micro-lecture types and the teaching stages in which they are used, providing a reference for using micro-lectures in university public computer courses and a basis for further research.

  14. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  15. Assessing the economic impact of public investment in Malaysia: a case study on MyRapid Transit project using a dynamic computable general equilibrium model

    OpenAIRE

    Muniandy, Meenachi

    2017-01-01

    The central focus of this thesis is the question of whether public investment in transport infrastructure contributes positively to Malaysia’s economic growth and welfare. Although there are strong analytical reasons to believe that public investment spending is one of the important variables that influence growth, there remains significant uncertainty about its actual degree of influence. In Malaysia, whenever there is a collapse in domestic demand, government spending becomes an important m...

16. Flash在计算机中的设计与实现%Design and Implementation of Flash Public Service Advertisements on the Computer

    Institute of Scientific and Technical Information of China (English)

    史耀进

    2012-01-01

Combining advertising theory with Flash advertising, this paper introduces the design and implementation of Flash public service advertisements and analyzes the characteristics of Flash public service advertising through a case study.

  17. Public Speech.

    Science.gov (United States)

    Green, Thomas F.

    1994-01-01

    Discusses the importance of public speech in society, noting the power of public speech to create a world and a public. The paper offers a theory of public speech, identifies types of public speech, and types of public speech fallacies. Two ways of speaking of the public and of public life are distinguished. (SM)

  18. Public Infrastructure for Monte Carlo Simulation: publicMC@BATAN

    CERN Document Server

    Waskita, A A; Akbar, Z; Handoko, L T; 10.1063/1.3462759

    2010-01-01

    The first cluster-based public computing system for Monte Carlo simulation in Indonesia is introduced. The system has been developed to enable the public to perform Monte Carlo simulations on a parallel computer through an integrated and user-friendly dynamic web interface. The beta version, called publicMC@BATAN, has been released and implemented for internal users at the National Nuclear Energy Agency (BATAN). In this paper the concept and architecture of publicMC@BATAN are presented.

  19. Domain Engineering - A Software Engineering discipline in Need of Research

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2000-01-01

    Before software can be developed its requirements must be stated. Before requirements can be expressed the application domain must be understood. In this paper we outline some of the basic facets of domain engineering. Domains seem, it is our experience, far more stable than computing requirements......, and these again seem more stable than software designs. Thus, almost like the universal laws of physics, it pays off to first develop theories of domains. But domain engineering, as in fact also requirements engineering, really is in need of thoroughly researched development principles, techniques and tools...... techniques. A brief example of describing stake-holder perspectives will be given - on the background of which we then proceed to survey the notions of domain intrinsics, domain support technologies, domain management & organisation, domain rules & regulations, domain human behaviour, etc. We show elsewhere...

  20. SELF LEARNING COMPUTER TROUBLESHOOTING EXPERT SYSTEM

    OpenAIRE

    Amanuel Ayde Ergado

    2016-01-01

    In the computer domain, professionals are limited in number, while the number of institutions looking for computer professionals is high. The aim of this study is to develop a self-learning expert system that provides troubleshooting information about problems occurring in the computer system, so that information and communication technology technicians and computer users can solve problems effectively and efficiently and make full use of computers and computer-related resources. Domain know...

  1. Research on the Building of an Electronic Government Public Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    吕小刚

    2016-01-01

    As China enters the stage of intelligent government construction, the model of government informatization has shifted from a government-led model to a new demand-oriented, problem-oriented model centered on meeting public needs. Under the new circumstances, the construction of a government cloud platform better integrates hardware and service resources and achieves efficient scheduling of resources that closely matches public demand. By explaining the concepts related to cloud computing and building on the principles of innovation, coordination, green development, openness, and sharing, this paper analyzes the significance of building an e-government public platform based on cloud computing and explores strategies for its construction.

  2. A Cross-domain Authentication Protocol based on ID

    Directory of Open Access Journals (Sweden)

    Zheng Jun

    2013-01-01

    Full Text Available In large distributed networks, many computers must coordinate with one another to complete certain tasks, and under certain conditions these computers may come from different domains. To ensure safe access to resources among computers in different domains, we propose a cross-domain union authentication scheme. We compute a cyclic group of large prime order on an elliptic curve, use the direct decomposition of this group to decompose its automorphism group, and design a signcryption scheme between domains based on bilinear maps over the automorphism group to achieve cross-domain union authentication. This scheme avoids the complexity of certificate transmission and the bottlenecks of PKI-based schemes; it can trace entities and supports two-way anonymous entity authentication, which prevents an authority from counterfeiting its members to access resources across domains. Analysis shows its advantages in security and communication consumption.

  3. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  4. Domains via Graphs

    Institute of Scientific and Technical Information of China (English)

    ZHANG Guoqiang; CHEN Yixiang

    2001-01-01

    This paper provides a concrete and simple introduction to two pillars of domain theory: (1) solving recursive domain equations, and (2) universal and saturated domains. Our exposition combines Larsen and Winskel's idea on solving domain equations using information systems with Girard's idea of stable domain theory in the form of coherence spaces, or graphs. Detailed constructions are given for universal and even homogeneous objects in two categories of graphs: one representing binary complete, prime algebraic domains with complete primes covering the bottom; the other representing ω-algebraic, prime algebraic lattices. The back-and-forth argument in model theory helps to enlighten the constructions.

  5. Domain Theory for Concurrency

    DEFF Research Database (Denmark)

    Nygaard, Mikkel

    a process is capable. HOPLA can directly encode calculi like CCS, CCS with process passing, and mobile ambients with public names, and it can be given a straightforward operational semantics supporting a standard bisimulation congruence. The denotational and operational semantics are related with simple...... and associated comonads, it highlights the role of linearity in concurrent computation. Two choices of comonad yield two expressive metalanguages for higher-order processes, both arising from canonical constructions in the model. Their denotational semantics are fully abstract with respect to contextual...... input necessary for it. Such a semantics is provided, based on event structures; it agrees with the presheaf semantics at first order and exposes the tensor operation as a simple parallel composition of event structures. The categorical model obtained from presheaves is very rich in structure and points...

  6. A case for using grid architecture for state public health informatics: the Utah perspective

    Directory of Open Access Journals (Sweden)

    Rolfs Robert

    2009-06-01

    Full Text Available Abstract This paper presents the rationale for designing and implementing the next generation of public health information systems using grid computing concepts and tools. Our attempt is to evaluate all grid types, including data grids for sharing information and computational grids for accessing computational resources on demand. Public health is a broad domain that requires coordinated use of disparate and heterogeneous information systems. System interoperability in public health is limited. The next generation of public health information systems must overcome barriers to integration and interoperability, leverage advances in information technology, address emerging requirements, and meet the needs of all stakeholders. Grid-based architecture provides one potential technical solution that deserves serious consideration. Within this context, we describe three discrete public health information system problems and the process by which the Utah Department of Health (UDOH) and the Department of Biomedical Informatics at the University of Utah in the United States have approached the exploration and eventual deployment of a Utah Public Health Informatics Grid. These three problems are: (i) integrating internal and external data sources with analytic tools and computational resources; (ii) providing external stakeholders with access to public health data and services; and (iii) accessing, integrating, and analyzing internal data for the timely monitoring of population health status and health services. After one year of experience, we have successfully implemented federated queries across disparate administrative domains, and have identified challenges and potential solutions concerning the selection of candidate analytic grid services, data-sharing concerns, security models, and strategies for reducing the expertise required at a public health agency to implement a public health grid.

  7. Domains of laminin

    DEFF Research Database (Denmark)

    Engvall, E; Wewer, U M

    1996-01-01

    Extracellular matrix molecules are often very large and made up of several independent domains, frequently with autonomous activities. Laminin is no exception. A number of globular and rod-like domains can be identified in laminin and its isoforms by sequence analysis as well as by electron microscopy. Here we present the structure-function relations in laminins by examination of their individual domains. This approach to viewing laminin is based on recent results from several laboratories. First, some mutations in laminin genes that cause disease have affected single laminin domains, and some laminin isoforms lack particular domains. These mutants and isoforms are informative with regard to the activities of the mutated and missing domains. Second, laminin-like domains have now been...

  8. Weakly distributive domains(Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    JIANG Ying; ZHANG Guo-Qiang

    2007-01-01

    In our previous work (Inform. and Comput., 2005, 202:87-103), we have shown that for any ω-algebraic meet-cpo D, if all higher-order stable function spaces built from D are ω-algebraic, then D is finitary. This accomplishes the first of a possible two-step process in solving the problem raised (LNCS, 1991, 530:16-33; Domains and Lambda-Calculi, Cambridge Univ. Press, 1998) of whether the category of stable bifinite domains of Amadio-Droste-Göbel (LNCS, 1991, 530:16-33; Theor. Comput. Sci., 1993, 111:89-101) is the largest cartesian closed full subcategory within the category of ω-algebraic meet-cpos with stable functions. This paper presents the results of the second step, which is to show that for any ω-algebraic meet-cpo D satisfying axioms M and I to be contained in a cartesian closed full subcategory of ω-algebraic meet-cpos with stable functions, it must not violate MI∞. We introduce a new class of domains called weakly distributive domains and show that for these domains to be in a cartesian closed category of ω-algebraic meet-cpos, property MI must not be violated. Further, we demonstrate that principally distributive domains (those for which each principal ideal is distributive) form a proper subclass of weakly distributive domains, and that Birkhoff's M3 and N5 (Introduction to Lattices and Order, Cambridge Univ. Press, 2002) are weakly distributive (but non-distributive). Then we establish characterization results for weakly distributive domains. We also introduce the notion of meet-generators in constructing stable functions and show that if an ω-algebraic meet-cpo D contains an infinite number of meet-generators, then [D→D] fails I. However, the original problem of Amadio and Curien remains open.

  9. Trade name and trademark versus domain

    Directory of Open Access Journals (Sweden)

    Jarmila Pokorná

    2013-01-01

    Full Text Available Internet domains have become an integral part of our lives, so it is easy to understand that conflicts can arise during their use, and that the parties to such conflicts will search for rules enabling their resolution. Since a domain name is, in the technical sense, a replacement for a computer's IP address, it is not in itself a trade name or trademark: it does not primarily belong to a person in the legal sense and does not serve to individualize that person. The average user, however, regularly associates domain names with the person offering goods or services on the relevant website. Domain names used by entrepreneurs in their business activity are often chosen so that the second-level domain (SLD) uses words that form the trade name of the company. This fact brings domain names close to designations that serve to individualize persons or products, especially trademarks and trade names. Domains can thus come into conflict with rights to designations, especially trademarks and trade names. Court practice resolves these conflicts using rules on unfair competition or rules protecting trade names and trademarks, but it is not ruled out that in the future a special legal regulation of domain names could be established.

  10. Partial domain wall partition functions

    CERN Document Server

    Foda, O

    2012-01-01

    We consider six-vertex model configurations on a rectangular lattice with n (N) horizontal (vertical) lines, and "partial domain wall boundary conditions" defined as 1. all 2n arrows on the left and right boundaries point inwards, 2. n_u (n_l) arrows on the upper (lower) boundary, such that n_u + n_l = N - n, also point inwards, 3. all remaining n+N arrows on the upper and lower boundaries point outwards, and 4. all spin configurations on the upper and lower boundaries are summed over. To generate (n-by-N) "partial domain wall configurations", one can start from A. (N-by-N) configurations with domain wall boundary conditions and delete n_u (n_l) upper (lower) horizontal lines, or B. (2n-by-N) configurations that represent the scalar product of an n-magnon Bethe eigenstate and an n-magnon generic state on an N-site spin-1/2 chain, and delete the n lines that represent the Bethe eigenstate. The corresponding "partial domain wall partition function" is computed in construction {A} ({B}) as an N-by-N (n-by-n) det...

  11. A scoping review of cloud computing in healthcare.

    Science.gov (United States)

    Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin

    2015-03-19

    Cloud computing is a recent and fast-growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allows for new ways of developing, delivering and using services. Cloud computing is often used in an "OMICS context", e.g. for computing in genomics, proteomics and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers, who consolidated their findings. 102 publications were analyzed and 6 main topics were found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-per-use characteristics of cloud-based services, avoiding upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing, and the characteristics of the particular cloud computing approach are unfortunately not really illustrated. Even though cloud computing in healthcare is of growing interest, only a few successful implementations yet exist and many papers just use the term "cloud" synonymously for "using virtual machines" or "web

  12. Evaluation of the TSC Dolphin Computer Assisted Instructional System in the Chapter 1 Program of the District of Columbia Public Schools. Final Report 85-9.

    Science.gov (United States)

    Harris, Carolyn DeMeyer; And Others

    Dolphin is a computer-assisted instruction system used to teach and reinforce skills in reading, language arts, and mathematics. An evaluation of this system was conducted to provide information to TSC Division of Houghton Mifflin regarding its effectiveness and possible modifications to the system. The general design of the evaluation was to…

  13. Two-Domain DNA Strand Displacement

    CERN Document Server

    Cardelli, Luca

    2010-01-01

    We investigate the computing power of a restricted class of DNA strand displacement structures: those that are made of double strands with nicks (interruptions) in the top strand. To preserve this structural invariant, we impose restrictions on the single strands they interact with: we consider only two-domain single strands consisting of one toehold domain and one recognition domain. We study fork and join signal-processing gates based on these structures, and we show that these systems are amenable to formalization and to mechanical verification.

  14. Using context to improve protein domain identification

    Directory of Open Access Journals (Sweden)

    Llinás Manuel

    2011-03-01

    Full Text Available Abstract Background: Identifying domains in protein sequences is an important step in protein structural and functional annotation. Existing domain recognition methods typically evaluate each domain prediction independently of the rest. However, the majority of proteins are multidomain, and pairwise domain co-occurrences are highly specific and non-transitive. Results: Here, we demonstrate how to exploit domain co-occurrence to boost weak domain predictions that appear in previously observed combinations, while penalizing higher-confidence domains if such combinations have never been observed. Our framework, Domain Prediction Using Context (dPUC), incorporates pairwise "context" scores between domains, along with traditional domain scores and thresholds, and improves domain prediction across a variety of organisms from bacteria to protozoa and metazoa. Among the genomes we tested, dPUC is most successful at improving predictions for the poorly-annotated malaria parasite Plasmodium falciparum, for which over 38% of the genome is currently unannotated. Our approach enables high-confidence annotations in this organism and the identification of orthologs to many core machinery proteins conserved in all eukaryotes, including those involved in ribosomal assembly and other RNA processing events, which surprisingly had not been previously known. Conclusions: Overall, our results demonstrate that this new context-based approach will provide significant improvements in domain and function prediction, especially for poorly understood genomes for which the need for additional annotations is greatest. Source code for the algorithm is available under a GPL open source license at http://compbio.cs.princeton.edu/dpuc/. Pre-computed results for our test organisms and a web server are also available at that location.
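
    The scoring idea described above, rewarding candidate domains that appear in previously observed pairs and penalizing combinations never seen together, can be illustrated with a small sketch. This is a toy reconstruction of the principle, not dPUC itself; the Pfam-style names, the scores, and the flat penalty for unobserved pairs are all illustrative assumptions, and the brute-force search is only viable for a handful of candidates.

    ```python
    from itertools import combinations

    def best_domain_set(candidates, context, never_seen_penalty=-3.0):
        """Pick the subset of candidate domains maximizing the sum of
        individual scores plus pairwise context scores.

        candidates : dict mapping domain name -> individual (e.g., HMM) score
        context    : dict mapping frozenset({a, b}) -> co-occurrence score;
                     pairs absent from the dict get `never_seen_penalty`
        """
        names = list(candidates)
        best, best_score = set(), float("-inf")
        for r in range(1, len(names) + 1):
            for subset in combinations(names, r):
                score = sum(candidates[d] for d in subset)
                for a, b in combinations(subset, 2):
                    score += context.get(frozenset((a, b)), never_seen_penalty)
                if score > best_score:
                    best, best_score = set(subset), score
        return best, best_score

    # Hypothetical candidates: a strong hit, a dubious hit, and a weak hit
    # that is known to co-occur with the strong one.
    candidates = {"PF00069": 20.0, "PF07714": 4.0, "PF00169": 2.0}
    context = {frozenset(("PF00069", "PF00169")): 5.0}
    chosen, total = best_domain_set(candidates, context)
    # The weak PF00169 is kept thanks to context; PF07714 is penalized out.
    ```

    In this toy run the context term rescues a weak prediction observed in a known combination and rejects a moderately scoring one that has never been seen with the others, which is exactly the behavior the abstract describes.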

  15. A Computational Inverse Technique for the Reconstruction of Multi-source Loads in the Time Domain

    Institute of Scientific and Technical Information of China (English)

    韩旭; 刘杰; 李伟杰; 赵卓群

    2009-01-01

    In the time domain, a load can be represented by a series of impulse or step kernel functions, and the system response is the convolution of the load with the responses corresponding to those kernel functions. Under the linear time-invariant assumption, the convolution of the system's dynamic response is discretized, and on this basis the ill-posedness of the load-identification inverse problem is analyzed. To deal with measurement noise, a zero-phase digital filter is applied to the response signal, together with a new extension algorithm that improves the filter's performance. Compared with a common difference filter, this zero-phase digital filter not only avoids phase delay but also reduces the waveform distortion at the start and end sections. To handle the ill-posedness arising from the inverse problem of load reconstruction, Tikhonov regularization, truncated singular value decomposition, and the total least squares method are adopted to provide an efficient and numerically stable solution for the desired unknown load, and the L-curve method is proposed to determine the optimal regularization parameter. To avoid inverting the matrix, various optimization methods are available; here the conjugate gradient method is adopted. In a numerical example, the reconstruction of dynamic loads from two sources using noisy responses of a hood structure is investigated. The results indicate that the presented computational inverse technique is effective and stable for load identification from noisy responses in the time domain.
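
    The discretized-convolution formulation and the Tikhonov step mentioned above can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions (a synthetic exponential-decay kernel and a hand-picked regularization parameter), not the authors' code; in practice the parameter would be chosen by the L-curve method and the normal equations would be solved iteratively, e.g., by conjugate gradients.

    ```python
    import numpy as np

    def reconstruct_load(h, y, lam):
        """Tikhonov-regularized reconstruction of a load f from a measured
        response y, assuming y = H f with H the lower-triangular Toeplitz
        matrix built from the sampled impulse response h (LTI assumption)."""
        n = len(y)
        H = np.zeros((n, n))
        for i in range(n):
            H[i, : i + 1] = h[i::-1]          # H[i, j] = h[i - j]
        # Minimize ||H f - y||^2 + lam * ||f||^2 via the normal equations.
        return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

    # Synthetic demo (all values illustrative): decaying kernel, known load.
    rng = np.random.default_rng(0)
    t = np.arange(200) * 0.01
    h = 0.05 * np.exp(-3.0 * t)               # assumed impulse response
    f_true = np.sin(5.0 * t) ** 2             # assumed "true" load
    y = np.convolve(h, f_true)[:200]          # exact discrete convolution
    y_noisy = y + 0.02 * np.abs(y).max() * rng.standard_normal(200)
    f_noisy = reconstruct_load(h, y_noisy, lam=1e-4)   # regularized estimate
    ```

    With noise-free data and a vanishing regularization parameter the scheme recovers the load almost exactly; with noisy data, increasing lam trades fidelity for stability, which is the essence of the ill-posedness discussed in the abstract.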

  16. Sierra Toolkit computational mesh conceptual model.

    Energy Technology Data Exchange (ETDEWEB)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  17. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can form Hoogsteen hydrogen bonds with other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology; it enables efficient data sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
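
    The probes at issue are simply those whose sequence contains a run of guanines long enough to supply a tetrad. A minimal sketch of that screening step follows; the threshold of four consecutive G's is an assumption drawn from the four-guanine tetrad described above, and the probe sequences are made up.

    ```python
    import re

    def has_g_spot(probe_seq, min_run=4):
        """True if the probe contains a run of at least `min_run` guanines,
        the motif associated with possible G-quadruplex formation."""
        return re.search(r"G{%d,}" % min_run, probe_seq.upper()) is not None

    # Hypothetical 25-mer probe sequences, as on Affymetrix 3' arrays.
    probes = [
        "ATCGGGGTACCTGATCAGTCAGTCA",   # contains GGGG: flagged
        "ATCGAGCTACCTGATCAGTCAGTCA",   # no long guanine run
    ]
    flags = [has_g_spot(p) for p in probes]   # [True, False]
    ```

    Mapping such a filter over the millions of probe sequences in public GeneChip datasets is exactly the kind of embarrassingly parallel scan that the cloud-based replication described above is suited to.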

  18. Public Values

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Rutgers, Mark R.

    2015-01-01

    This article provides the introduction to a symposium on contemporary public values research. It is argued that the contributions to this symposium represent a Public Values Perspective, distinct from other specific lines of research that also use public value as a core concept. Public administration is approached in terms of processes guided or restricted by public values and as public value creating: public management and public policy-making are both concerned with establishing, following and realizing public values. To study public values a broad perspective is needed. The article suggests a research agenda for this encompassing kind of public values research. Finally, the contributions to the symposium are introduced.

  19. An Empirical Study of the Periodic Fluctuation of Public Opinion across Microblogging and Other Domains

    Institute of Scientific and Technical Information of China (English)

    盛宇

    2015-01-01

    Connecting the period of microblog public opinion with the period of the objective factors triggering microblog events, this paper conducts a cross-domain study of periodic fluctuation online and offline. It puts forward a complete set of research ideas and methods and, taking the haze event on microblogging as an example, conducts an empirical study to test their feasibility. The conclusions reveal the rules of interaction between the periods of microblog public opinion and of the objective factors, and a microblog public-opinion forecasting model is proposed. The study deepens the theory of periodic microblog public opinion and provides a new approach to early warning of microblog public opinion; it has significant theoretical and practical value.

  20. Analysis of the Course Construction of Computer Forensics in Public Security Colleges

    Institute of Scientific and Technical Information of China (English)

    曹敏

    2016-01-01

    Based on an analysis of the current state of computer forensics and the urgent needs of the information network security supervision major in public security colleges, this paper puts forward suggestions for the construction of a computer forensics course. The discussion focuses mainly on the nature and orientation of the course, teaching ideas and suggestions, and the construction of a laboratory whose basic function is teaching.

  1. Quality Computer Assisted Mobile Learning (CAML) and Distance Education Leadership in Managing Technology Enhanced Learning Management System (TELMS) in the Malaysian Public Tertiary Education

    OpenAIRE

    Lee Tan Luck

    2009-01-01

    Abstract - Success in the implementation of quality computer-assisted mobile learning and distance education in a Technology Enhanced Learning Management System relies heavily on academic leadership in managing and applying Information and Communication Technology (ICT) at the tertiary level. The effectiveness of its leadership, knowledge, application and management of ICT and the learning management system is of utmost importance. Successful application and management includes qua...

  2. Public Library Training Program for Older Adults Addresses Their Computer and Health Literacy Needs. A Review of: Xie, B. (2011). Improving older adults’ e-health literacy through computer training using NIH online resources. Library & Information Science Research, 34, 63-71. doi:10.1016/j.lisr.2011.07.006

    Directory of Open Access Journals (Sweden)

    Cari Merkley

    2012-12-01

    Full Text Available Objective – To evaluate the efficacy of an e-health literacy educational intervention aimed at older adults. Design – Pre- and post-intervention questionnaires administered in an experimental study. Setting – Two public library branches in Maryland. Subjects – 218 adults between 60 and 89 years of age. Methods – A convenience sample of older adults was recruited to participate in a four-week training program structured around the National Institutes of Health toolkit Helping Older Adults Search for Health Information Online. During the program, classes met at the participating libraries twice a week. Sessions were two hours in length, and employed hands-on exercises led by Master of Library Science students. The training included an introduction to the Internet, as well as in-depth training in the use of the NIHSeniorHealth and MedlinePlus websites. In the first class, participants were asked to complete a pre-training questionnaire that included questions relating to demographics and previous computer and Internet experience, as well as measures from the Computer Anxiety Scale and two subscales of the Attitudes toward Computers Questionnaire. Participants between September 2008 and June 2009 also completed pre-training computer and web knowledge tests that asked individuals to label the parts of a computer and of a website using a provided list of terms. At the end of the program, participants were asked to complete post-training questionnaires that included the previously employed questions from the Computer Anxiety Scale and Attitudes toward Computers Questionnaire. New questions were added relating to the participants’ satisfaction with the training, its impact on their health decision making, their perceptions of public libraries, and the perceived usability and utility of the two websites highlighted during the training program. Those who completed pre-training knowledge tests were also asked to complete the same exercises at the end of the program. Main Results

  3. Domain Analysis of the Mexican Journal Investigación Bibliotecológica

    Directory of Open Access Journals (Sweden)

    Félix de Moya-Anegón

    2001-12-01

    Full Text Available Domain analysis comprises a set of methodologies that make it possible to delineate the structure of relations existing in a given discipline. The aim of this paper is to provide a domain analysis of the discipline of Library and Information Science (LIS) in Mexico. For that purpose, the bibliography cited by the journal Investigación Bibliotecológica (IB) is analyzed. The elements analyzed include production, authorship, co-authorship, cited sources, and journal co-citation.

  4. DATA TRANSFER FROM A DEC PDP-11 BASED MASS-SPECTROMETRY DATA STATION TO AN MS-DOS PERSONAL-COMPUTER

    NARCIS (Netherlands)

    RAFFAELLI, A; BRUINS, AP

    This paper describes a simple procedure for obtaining better-quality graphic output of mass spectrometry data from data systems equipped with poor-quality printing devices. The procedure uses KERMIT, low-cost public domain software, to transfer ASCII tables to an MS-DOS personal computer, where

  5. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  6. The Application of the Computer Desktop Cloud in the Public Teaching Areas of a Campus Network

    Institute of Scientific and Technical Information of China (English)

    王喆

    2016-01-01

    Computer maintenance in the public teaching areas of a campus network has long plagued network administrators. This paper describes how desktop cloud technology can be used within the campus network to centralize management of the computers in public teaching areas, which improves their security, makes their use more environmentally friendly, and greatly reduces later maintenance costs.

  7. Translation domains in multiferroics

    OpenAIRE

    Meier, D; Leo, N; Jungk, T.; Soergel, E.; Becker, P.; Bohaty, L.; Fiebig, M.

    2010-01-01

    Translation domains differing in the phase but not in the orientation of the corresponding order parameter are resolved in two types of multiferroics. Hexagonal (h-) YMnO$_3$ is a split-order-parameter multiferroic in which commensurate ferroelectric translation domains are resolved by piezoresponse force microscopy whereas MnWO$_4$ is a joint-order-parameter multiferroic in which incommensurate magnetic translation domains are observed by optical second harmonic generation. The pronounced ma...

  8. Frustratingly Easy Domain Adaptation

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We describe an approach to domain adaptation that is appropriate exactly in the case when one has enough ``target'' data to do slightly better than just using only ``source'' data. Our approach is incredibly simple, easy to implement as a preprocessing step (10 lines of Perl!) and outperforms state-of-the-art approaches on a range of datasets. Moreover, it is trivially extended to a multi-domain adaptation problem, where one has data from a variety of different domains.

  9. Staggered domain wall fermions

    CERN Document Server

    Hoelbling, Christian

    2016-01-01

    We construct domain wall fermions with a staggered kernel and investigate their spectral and chiral properties numerically in the Schwinger model. In some relevant cases we see an improvement of chirality by more than an order of magnitude as compared to usual domain wall fermions. Moreover, we present first results for four-dimensional quantum chromodynamics, where we also observe significant reductions of chiral symmetry violations for staggered domain wall fermions.

  10. Pragmatic circuits frequency domain

    CERN Document Server

    Eccles, William

    2006-01-01

    Pragmatic Circuits: Frequency Domain goes through the Laplace transform to get from the time domain to topics that include the s-plane, Bode diagrams, and the sinusoidal steady state. This second of three volumes ends with a-c power, which, although it is just a special case of the sinusoidal steady state, is an important topic with unique techniques and terminology. Pragmatic Circuits: Frequency Domain is focused on the frequency domain. In other words, time will no longer be the independent variable in our analysis. The two other volumes in the Pragmatic Circuits series include titles on DC

  11. Research on Digital Publishing Models for Small and Medium-Sized Publishers in the Cloud Computing Era

    Institute of Scientific and Technical Information of China (English)

    崔明; 段琳琳

    2011-01-01

    The emergence of cloud computing has changed the way people obtain information, as well as their lives and ways of thinking. Transformations in IT technology often bring great changes to other industries. Changes in how people obtain information and in their reading habits have made digital publishing an irreversible trend in China's publishing industry. Faced with the high investment that digital publishing requires, small and medium-sized publishers, which are financially weak but hold distinctive, high-quality content resources, urgently need a digital publishing model suited to them. Against the background of the cloud computing era, this article proposes a digital publishing model for small and medium-sized publishers, with the aim of guiding them in carrying out digital publishing.

  12. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become one of the hottest technologies. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as its value chain and standardization efforts.

  13. Taking the Temperature of Pedestrian Movement in Public Spaces

    DEFF Research Database (Denmark)

    Nielsen, Søren Zebitz; Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Cities require data on pedestrian movement to evaluate the use of public spaces. We propose a system using thermal cameras and Computer Vision (CV) combined with Geographical Information Systems (GIS) to track and assess pedestrian dynamics and behaviors in urban plazas. Thermal cameras operate independently of light, and the technique is non-intrusive and preserves privacy. The approach extends the analysis to the GIS domain by capturing georeferenced tracks. We present a pilot study conducted in Copenhagen in 2013. The tracks retrieved by CV are compared to manually annotated ground truth tracks...

  14. Domain Relaxation in Langmuir Films

    Science.gov (United States)

    Bernoff, Andrew J.; Alexander, James C.; Mann, Elizabeth K.; Mann, J. Adin; Zou, Lu; Wintersmith, Jacob R.

    2007-11-01

    We report on an experimental, theoretical and computational study of a molecularly thin polymer Langmuir layer domain on the surface of a subfluid. When stretched (by a transient stagnation flow), the Langmuir layer takes the form of a bola consisting of two roughly circular reservoirs connected by a thin tether. This shape relaxes to the circular minimum energy configuration. The tether is never observed to rupture, even when it is more than a hundred times as long as it is thin. We model these experiments as a free boundary problem where motion is driven by the line tension of the domain and damped by the viscosity of the subfluid. We process the digital images of the experiment to extract the domain shape, use one of these shapes as an initial condition for the numerical solution of a boundary-integral model of the underlying hydrodynamics, and compare the subsequent images of the experiment to the numerical simulation. The numerical evolutions verify that our hydrodynamical model can reproduce the observed dynamics. They also allow us to deduce the magnitude of the line tension in the system, often to within 1%.

  15. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  16. The enterprise engineering domain

    CSIR Research Space (South Africa)

    De Vries, M

    2015-06-01

    Full Text Available representation of the EE domain within the emerging EE discipline. We used a questionnaire to gather the views of EE and enterprise architecture (EA) researchers and practitioners on the EE domain. The main contributions of this article include: (1...

  17. Domain wall filters

    CERN Document Server

    Bär, O; Neuberger, H; Witzel, O; Baer, Oliver; Narayanan, Rajamani; Neuberger, Herbert; Witzel, Oliver

    2007-01-01

    We propose using the extra dimension separating the domain walls carrying lattice quarks of opposite handedness to gradually filter out the ultraviolet fluctuations of the gauge fields that are felt by the fermionic excitations living in the bulk. This generalization of the homogeneous domain wall construction has some theoretical features that seem nontrivial.

  18. Domain Walls on Singularities

    CERN Document Server

    Halyo, Edi

    2009-01-01

    We describe domain walls that live on $A_2$ and $A_3$ singularities. The walls are BPS if the singularity is resolved and non--BPS if it is deformed and fibered. We show that these domain walls may interpolate between vacua that support monopoles and/or vortices.

  19. Domains of Learning.

    Science.gov (United States)

    Gagne, Robert M.

    In planning educational research, recognition needs to be made of five domains of learning: (1) motor skills, (2) verbal information, (3) intellectual skills, (4) cognitive strategies, and (5) attitudes. In being cognizant of these domains, the researcher is able to distinguish the parts of a content area which are subject to different…

  20. A Domain Analysis Bibliography

    Science.gov (United States)

    1990-06-01

    Bauhaus, a prototype CASE workstation for D-SAPS development. [ARAN88A] Guillermo F. Arango. Domain Engineering for Software Reuse. PhD thesis. [VITA90B] Domain Analysis within the ISEC Rapid Center. CMU/SEI-90-SR-3, Appendix III: Alphabetical by Organization/Project.

  1. Public lighting.

    NARCIS (Netherlands)

    Schreuder, D.A.

    1986-01-01

    The function of public lighting and the relationship between public lighting and accidents are considered briefly as aspects of effective countermeasures. Research needs and recent developments in installation and operation are described. Public lighting is an efficient accident countermeasure, but

  2. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  3. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  4. IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains

    Science.gov (United States)

    Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.

    2014-12-01

    The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to

  5. 计算机组装与维护公共选修课教学探析%Analysis on Teaching of Computer Assembly and Maintenance Public Elective Course

    Institute of Scientific and Technical Information of China (English)

    刘延锋; 徐晓昭

    2013-01-01

    This paper points out the main problems in teaching the computer assembly and maintenance public elective course to non-computer majors in higher vocational colleges: an outdated curriculum system, a lack of practical training facilities, lagging teaching materials, and a single mode of instruction. Through an analysis of these problems, it offers corresponding teaching suggestions: formulate an appropriate syllabus, use virtual machine technology to conduct experiments and practical training, collect and supplement material on new computer technologies, and flexibly employ a variety of teaching methods. In addition, the paper suggests extracurricular activities such as social practice surveys, computer assembly and maintenance competitions, and a campus volunteer repair team, to genuinely improve students' computer assembly and maintenance skills.

  6. Computation and Learning in Visual Development

    Directory of Open Access Journals (Sweden)

    M Nardini

    2014-08-01

    Full Text Available In a special issue marking 30 years since the publication of Marr's Vision (Perception 41:9, 2012), Poggio proposed an update to Marr's influential “levels of understanding” framework. As well as understanding which algorithms are used for computations such as stereo or object recognition, we also need to understand how observers learn these algorithms, and how this learning is accomplished by neural circuits. I will describe research that addresses this problem in the domain of cue combination. In the last decade, linear cue combination has emerged as a common principle in visual and multisensory processing. In very many tasks, a computational goal (to minimise sensory uncertainty) is achieved by the algorithm of weighted averaging. This framework provides a good description of observers' behaviour when combining sensory estimates (e.g., multiple depth cues). However, research has repeatedly shown that the computations carried out by developing perceptual systems – up to 8 years or later in humans – are not those leading to uncertainty reduction via weighted averaging. I will describe results showing how developing and mature perceptual systems differ in their computations when combining sensory cues, and outline two key problems for current and future research: 1. understanding the reorganisation of neural information processing that underlies these computational changes, and 2. understanding the learning mechanisms by which we acquire cue combination abilities through perceptual experience.
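    The weighted-averaging rule this abstract refers to is compact enough to sketch directly. The function below is an illustrative minimal implementation of minimum-variance (inverse-variance-weighted) cue combination; the function name and the two-cue example are assumptions, not material from the talk.

    ```python
    import numpy as np

    def combine_cues(estimates, variances):
        """Minimum-variance linear cue combination (weighted averaging).

        Each cue is weighted by its reliability (inverse variance), so the
        combined estimate has lower variance than any single cue.
        """
        estimates = np.asarray(estimates, dtype=float)
        reliabilities = 1.0 / np.asarray(variances, dtype=float)
        weights = reliabilities / reliabilities.sum()   # weights sum to 1
        combined = float(np.dot(weights, estimates))    # weighted average
        combined_var = 1.0 / reliabilities.sum()        # reduced uncertainty
        return combined, combined_var

    # Two hypothetical depth cues: stereo (variance 1) and texture (variance 4)
    estimate, variance = combine_cues([10.0, 14.0], [1.0, 4.0])
    # weights 0.8 / 0.2 -> estimate 10.8, combined variance 0.8
    ```

    With one reliable and one unreliable cue, the combined estimate sits closer to the reliable cue, and its variance is lower than either cue's alone.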

  7. Estimation of Polonium-210 activity in marine and terrestrial samples and computation of ingestion dose to the public in and around Kanyakumari coast, India

    Directory of Open Access Journals (Sweden)

    L. Macklin Rani

    2014-04-01

    Full Text Available The brown mussel Perna perna, an effective bioindicator species for monitoring radioactive pollution, was used to evaluate the concentration of 210Po in and around the coastal areas of Kanyakumari, a monazite-rich region. The 210Po concentration in P. perna collected from ten different locations in this region exhibited values ranging between 78.09 ± 5.5 and 320.00 ± 18.1 Bq/kg (wet). Kalluvilai recorded the maximum concentration of 210Po (320.00 ± 18.1 Bq/kg), and hence further studies involving the activity of 210Po in other marine organisms and terrestrial samples were carried out from this site. The annual intake of 210Po by the population residing in this location via dietary sources was estimated. Similarly, the total annual committed effective dose to the public was found to be 2.24 mSv/year. The results obtained were compared to the values reported by earlier studies in India and also in other countries.
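    The dose estimate in this record is a simple product of intake and dose coefficient. A hedged sketch, assuming the ICRP 72 adult ingestion dose coefficient for 210Po (1.2 × 10⁻⁶ Sv/Bq); the intake figures are illustrative placeholders, not the study's data:

    ```python
    def committed_effective_dose_msv(intakes_bq_per_year, dose_coeff_sv_per_bq=1.2e-6):
        """Annual committed effective dose (mSv/year) from dietary 210Po.

        dose = total intake (Bq/year) x ingestion dose coefficient (Sv/Bq),
        converted to mSv. The default coefficient is the ICRP 72 adult
        ingestion value for 210Po; the intakes are hypothetical examples.
        """
        return sum(intakes_bq_per_year) * dose_coeff_sv_per_bq * 1.0e3

    # hypothetical yearly intakes via mussels, fish, and cereals (Bq/year)
    dose = committed_effective_dose_msv([500.0, 300.0, 200.0])
    # 1000 Bq/year x 1.2e-6 Sv/Bq = 1.2 mSv/year
    ```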

  8. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  9. Masked object registration in the Fourier domain.

    Science.gov (United States)

    Padfield, Dirk

    2012-05-01

    Registration is one of the most common tasks of image analysis and computer vision applications. The requirements of most registration algorithms include large capture range and fast computation so that the algorithms are robust to different scenarios and can be computed in a reasonable amount of time. For these purposes, registration in the Fourier domain using normalized cross-correlation is well suited and has been extensively studied in the literature. Another common requirement is masking, which is necessary for applications where certain regions of the image that would adversely affect the registration result should be ignored. To address these requirements, we have derived a mathematical model that describes an exact form for embedding the masking step fully into the Fourier domain so that all steps of translation registration can be computed efficiently using Fast Fourier Transforms. We provide algorithms and implementation details that demonstrate the correctness of our derivations. We also demonstrate how this masked FFT registration approach can be applied to improve the Fourier-Mellin algorithm that calculates translation, rotation, and scale in the Fourier domain. We demonstrate the computational efficiency, advantages, and correctness of our algorithm on a number of images from real-world applications. Our framework enables fast, global, parameter-free registration of images with masked regions.
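    The unmasked core of FFT-based translation registration can be sketched in a few lines; the masked, normalized variant derived in the paper carries additional mask FFT terms through the same machinery. Function and variable names here are illustrative assumptions, not the paper's code:

    ```python
    import numpy as np

    def fft_translation(fixed, moving):
        """Estimate the integer translation between two images via FFT
        cross-correlation (the unmasked baseline of Fourier-domain
        registration; masking adds normalization terms to this correlation).
        """
        F = np.fft.fft2(fixed)
        M = np.fft.fft2(moving)
        corr = np.fft.ifft2(F * np.conj(M)).real        # circular cross-correlation
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # fold peaks in the upper half of each axis into negative shifts
        return tuple(int(p) if p <= s // 2 else int(p) - s
                     for p, s in zip(peak, corr.shape))

    # a small bright block shifted 2 rows down and 3 columns right
    img = np.zeros((32, 32))
    img[8:12, 8:12] = 1.0
    shifted = np.roll(img, (2, 3), axis=(0, 1))
    # fft_translation(shifted, img) recovers the shift (2, 3)
    ```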

  10. New classes of domains with explicit Bergman kernel

    Institute of Scientific and Technical Information of China (English)

    YIN Weiping; LU Keping; Roos GUY

    2004-01-01

    We introduce two classes of egg type domains, built on general bounded symmetric domains, for which we obtain the Bergman kernel in explicit formulas. As an auxiliary tool, we compute the integral of complex powers of the generic norm on a bounded symmetric domain using the well-known integral of Selberg. This generalizes matrix integrals of Hua and leads to a special polynomial with integer or half-integer coefficients attached to each irreducible bounded symmetric domain.

  11. Public Broadcasting.

    Science.gov (United States)

    Shooshan, Harry M.; Arnheim, Louise

    This paper, the second in a series exploring future options for public policy in the communications and information arenas, examines some of the issues underlying public broadcasting, primarily public television. It advances two reasons why quality local public television programming is scarce: funds for the original production of programming have…

  12. Domain-Specific Multimodeling

    DEFF Research Database (Denmark)

    Hessellund, Anders

    Enterprise systems are complex artifacts. They are hard to build, manage, understand, and evolve. Existing software development paradigms fail to properly address challenges such as system size, domain complexity, and software evolution when development is scaled to enterprise systems. We propose domain-specific multimodeling as a development paradigm to tackle these challenges in a language-oriented manner. The different concerns of a system are conceptually separated and made explicit as independent domain-specific languages. This approach increases productivity and quality by raising the overall level of abstraction. It does, however, also introduce a new problem of coordinating multiple different languages in a single system. We call this problem the coordination problem. In this thesis, we present the coordination method for domain-specific multimodeling that explicitly targets...

  13. Conserved Domain Database (CDD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — CDD is a protein annotation resource that consists of a collection of well-annotated multiple sequence alignment models for ancient domains and full-length proteins.

  14. Efficient integration method for fictitious domain approaches

    Science.gov (United States)

    Duczek, Sascha; Gabbert, Ulrich

    2015-10-01

    In the current article, we present an efficient and accurate numerical method for the integration of the system matrices in fictitious domain approaches such as the finite cell method (FCM). In the framework of the FCM, the physical domain is embedded in a geometrically larger domain of simple shape which is discretized using a regular Cartesian grid of cells. Therefore, a spacetree-based adaptive quadrature technique is normally deployed to resolve the geometry of the structure. Depending on the complexity of the structure under investigation this method accounts for most of the computational effort. To reduce the computational costs for computing the system matrices an efficient quadrature scheme based on the divergence theorem (Gauß-Ostrogradsky theorem) is proposed. Using this theorem the dimension of the integral is reduced by one, i.e. instead of solving the integral for the whole domain only its contour needs to be considered. In the current paper, we present the general principles of the integration method and its implementation. The results to several two-dimensional benchmark problems highlight its properties. The efficiency of the proposed method is compared to conventional spacetree-based integration techniques.
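    The dimension reduction described here can be illustrated on a polygonal cell: applying the divergence theorem with F = (x, 0) turns the area integral over the cell into a line integral over its contour. This is a minimal 2-D sketch of the principle, not the FCM implementation:

    ```python
    def domain_area(vertices):
        """Area of a polygonal cell computed from its contour alone.

        With F = (x, 0), the divergence theorem gives
        area = integral of div F over the cell = contour integral of x dy,
        evaluated exactly edge by edge (vertices in counter-clockwise order).
        """
        area = 0.0
        n = len(vertices)
        for i in range(n):
            x0, y0 = vertices[i]
            x1, y1 = vertices[(i + 1) % n]
            area += (x0 + x1) * (y1 - y0) / 2.0   # exact integral of x dy on one edge
        return area

    # unit square: the 2-D integral reduces to four 1-D edge contributions
    area = domain_area([(0, 0), (1, 0), (1, 1), (0, 1)])
    ```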

  15. Parallel algorithm for computing points on a computation front hyperplane

    Science.gov (United States)

    Krasnov, M. M.

    2015-01-01

    A parallel algorithm for computing points on a computation front hyperplane is described. This task arises in the computation of a quantity defined on a multidimensional rectangular domain. Three-dimensional domains are usually discussed, but the material is given in the general form, for any number of dimensions of at least two. When the values of a quantity at different points are internally independent (which is frequently the case), the corresponding computations are independent as well and can be performed in parallel. However, if there are internal dependences (as, for example, in the Gauss-Seidel method for systems of linear equations), then the order of scanning the points of the domain is an important issue. A conventional approach in this case is to form a computation front hyperplane (a usual plane in the three-dimensional case and a line in the two-dimensional case) that moves linearly across the domain at a certain angle. At every step in the course of motion of this hyperplane, its intersection points with the domain can be treated independently and, hence, in parallel, but the steps themselves are executed sequentially. At different steps, the intersection of the hyperplane with the entire domain can have a rather complex geometry, and the search for all points of the domain lying on the hyperplane at a given step is a nontrivial problem. This problem (i.e., the computation of the coordinates of points lying in the intersection of the domain with the hyperplane at a given step in the course of hyperplane motion) is addressed below. The computations over the points of the hyperplane can be executed in parallel.
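    A brute-force reference sketch of the wavefront idea, assuming a Gauss-Seidel-style stencil in which each point depends only on its lower-index neighbours; the paper's contribution is computing these coordinates efficiently, and the names here are illustrative:

    ```python
    from itertools import product

    def front_points(shape, step):
        """All points of a rectangular domain lying on the computation-front
        hyperplane x0 + x1 + ... + x_{n-1} = step (brute force, for clarity)."""
        return [p for p in product(*(range(s) for s in shape)) if sum(p) == step]

    def sweep(shape):
        """Sequential front steps; the points within one step carry no mutual
        dependences under a lower-neighbour stencil, so each front can be
        processed in parallel while the steps themselves run sequentially."""
        for step in range(sum(s - 1 for s in shape) + 1):
            yield front_points(shape, step)

    # 3x3 domain: the fronts are the anti-diagonals
    # step 0: (0,0); step 1: (0,1),(1,0); step 2: (0,2),(1,1),(2,0); ...
    fronts = list(sweep((3, 3)))
    ```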

  16. 云计算模式下的物流公共信息平台设计研究%Design of Logistics Public Information Platform under Cloud Computation

    Institute of Scientific and Technical Information of China (English)

    杨从亚; 徐海峰

    2013-01-01

    In this paper, drawing on the advantages of cloud computing technology, we propose an approach to the design of a logistics public information platform. Starting from an analysis of the platform's basic requirements, we determine its basic positioning and then, from the cloud computing perspective, analyze and design the platform's system functions and subsystem modules, laying a foundation for system development and application.

  17. Non-linear absorption of 1.3-μm wavelength femtosecond laser pulses focused inside semiconductors: Finite difference time domain-two temperature model combined computational study

    Science.gov (United States)

    Bogatyrev, I. B.; Grojo, D.; Delaporte, P.; Leyder, S.; Sentis, M.; Marine, W.; Itina, T. E.

    2011-11-01

    We present a theoretical model, which describes local energy deposition inside IR-transparent silicon and gallium arsenide with focused 1.3-μm wavelength femtosecond laser pulses. Our work relies on the ionization rate equation and two temperature model (TTM), as we simulate the non-linear propagation of focused femtosecond light pulses by using a 3D finite difference time domain method. We find a strong absorption dependence on the initial free electron density (doping concentration) that evidences the role of avalanche ionization. Despite an influence of Kerr-type self-focusing at intensity required for non-linear absorption, we show the laser energy deposition remains confined when the focus position is moved down to 1-mm below the surface. Our simulation results are in agreement with the degree of control observed in a simple model experiment.

  18. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during these last two decades, the use of intelligent Computer Graphics techniques has grown year after year, and more and more interesting techniques are presented in this area.   The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011).   Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  19. Multigrid Algorithms for Domain-Wall Fermions

    CERN Document Server

    Cohen, Saul D; Clark, M A; Osborn, J C

    2012-01-01

    We describe an adaptive multigrid algorithm for solving inverses of the domain-wall fermion operator. Our multigrid algorithm uses an adaptive projection of near-null vectors of the domain-wall operator onto coarser four-dimensional lattices. This extension of multigrid techniques to a chiral fermion action will greatly reduce overall computation cost, and the elimination of the fifth dimension in the coarse space reduces the relative cost of using chiral fermions compared to discarding this symmetry. We demonstrate near-elimination of critical slowing as the quark mass is reduced and small volume dependence, which may be suppressed by taking advantage of the recursive nature of the algorithm.

  20. Strongly Semicontinuous Domains and Semi-FS Domains

    Directory of Open Access Journals (Sweden)

    Qingyu He

    2014-01-01

    Full Text Available We are mainly concerned with some special kinds of semicontinuous domains and the relationships between them. New concepts of strongly semicontinuous domains, meet semicontinuous domains and semi-FS domains are introduced. It is shown that a dcpo L is strongly semicontinuous if and only if L is semicontinuous and meet semicontinuous. It is proved that semi-FS domains are strongly semicontinuous. Some interpolation properties of semiway-below relations in (strongly) semicontinuous bc-domains are given. In terms of these properties, it is proved that strongly semicontinuous bc-domains, in particular strongly semicontinuous lattices, are all semi-FS domains.

  1. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy products: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of process or plant detail: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that the use of BEST-Dairy tool will advance understanding of energy and

  2. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  3. Quasicontinuous functions, domains, and extended calculus

    Directory of Open Access Journals (Sweden)

    Rodica Cazacu

    2007-04-01

    Full Text Available One of the aims of domain theory is the construction of an embedding of a given structure or data type as the maximal or “ideal” elements of an enveloping domain of “approximations,” sometimes called a domain environment. Typically the goal is to provide a computational model or framework for recursive and algorithmic reasoning about the original structure. In this paper we consider the function space of (natural equivalence classes of) quasicontinuous functions from a locally compact space X into L, an n-fold product of the extended reals [−1,1] (more generally, into a bicontinuous lattice). We show that the domain of all “approximate maps” that assign to each point of X an order interval of L is a domain environment for the quasicontinuous function space. We rely upon the theory of domain environments to introduce an interesting and useful function space topology on the quasicontinuous function space. We then apply this machinery to define an extended differential calculus in the quasicontinuous function space, and draw connections with viscosity solutions of Hamiltonian equations. The theory depends heavily on topological properties of quasicontinuous functions that have recently been uncovered, involving dense sets of points of continuity and sections of closed relations and USCO maps. These and other basic results about quasicontinuous functions are surveyed and presented in the early sections.

  4. Structured hints : extracting and abstracting domain expertise.

    Energy Technology Data Exchange (ETDEWEB)

    Hereld, M.; Stevens, R.; Sterling, T.; Gao, G. R.; Mathematics and Computer Science; California Inst. of Tech.; Louisiana State Univ.; Univ. of Delaware

    2009-03-16

    We propose a new framework for providing information to help optimize domain-specific application codes. Its design addresses problems that derive from the widening gap between the domain problem statement by domain experts and the architectural details of new and future high-end computing systems. The design is particularly well suited to program execution models that incorporate dynamic adaptive methodologies for live tuning of program performance and resource utilization. This new framework, which we call 'structured hints', couples a vocabulary of annotations to a suite of performance metrics. The immediate target is development of a process by which a domain expert describes characteristics of objects and methods in the application code that would not be readily apparent to the compiler; the domain expert provides further information about what quantities might provide the best indications of desirable effect; and the interactive preprocessor identifies potential opportunities for the domain expert to evaluate. Our development of these ideas is progressing in stages from case study, through manual implementation, to automatic or semi-automatic implementation. In this paper we discuss results from our case study, an examination of a large simulation of a neural network modeled after the neocortex.

  5. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  6. Bregmanized Domain Decomposition for Image Restoration

    KAUST Repository

    Langer, Andreas

    2012-05-22

    Computational problems involving large-scale data have recently been gaining attention due to better hardware and, hence, the higher dimensionality of images and data sets acquired in applications. In the last couple of years, non-smooth minimization problems such as total variation minimization have become increasingly important for the solution of these tasks. While favorable due to the improved enhancement of images compared to smooth imaging approaches, non-smooth minimization problems typically scale badly with the dimension of the data. Hence, for large imaging problems solved by total variation minimization, domain decomposition algorithms have been proposed, aiming to split one large problem into N > 1 smaller problems which can be solved on parallel CPUs. The N subproblems constitute constrained minimization problems, where the constraint enforces the support of the minimizer to be the respective subdomain. In this paper we discuss a fast computational algorithm to solve domain decomposition for total variation minimization. In particular, we accelerate the computation of the subproblems by nested Bregman iterations. We propose a Bregmanized Operator Splitting-Split Bregman (BOS-SB) algorithm, which enforces the restriction onto the respective subdomain by a Bregman iteration that is subsequently solved by a Split Bregman strategy. The computational performance of this new approach is discussed for its application to image inpainting and image deblurring. It turns out that the proposed new solution technique is up to three times faster than the iterative algorithm currently used in domain decomposition methods for total variation minimization. © Springer Science+Business Media, LLC 2012.
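    As a hedged illustration of the Split Bregman strategy mentioned above, the following sketch applies plain Split Bregman to 1D total variation denoising, min_u (mu/2)||u − f||² + ||Du||₁. It is a scalar stand-in, not the authors' BOS-SB algorithm; the parameters mu and lam and the dense linear solve are illustrative choices for a small demo.

```python
import numpy as np

# Hedged sketch of a plain Split Bregman iteration for 1D total variation
# denoising: min_u (mu/2)||u - f||^2 + ||Du||_1. This is a scalar analogue
# of the subproblem solver described in the paper, not the authors' code.
def tv_denoise_1d(f, mu=10.0, lam=1.0, iters=100):
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # forward-difference operator
    A = mu * np.eye(n) + lam * D.T @ D      # normal equations for the u-step
    d = np.zeros(n - 1)                     # auxiliary variable d ~ Du
    b = np.zeros(n - 1)                     # Bregman variable
    u = f.copy()
    for _ in range(iters):
        # u-step: quadratic subproblem (dense solve, fine for a demo)
        u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))
        Du = D @ u
        # d-step: soft shrinkage with threshold 1/lam
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - 1.0 / lam, 0.0)
        # Bregman update
        b = b + Du - d
    return u
```

On a noisy piecewise-constant signal, the iteration removes small oscillations while preserving the jump, so the total variation of the output is lower than that of the input.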

  7. Domains in Ferroelectric Nanostructures

    Science.gov (United States)

    Gregg, Marty

    2010-03-01

    Ferroelectric materials have great potential in influencing the future of small scale electronics. At a basic level, this is because ferroelectric surfaces are charged, and so interact strongly with charge-carrying metals and semiconductors - the building blocks for all electronic systems. Since the electrical polarity of the ferroelectric can be reversed, surfaces can both attract and repel charges in nearby materials, and can thereby exert complete control over both charge distribution and movement. It should be no surprise, therefore, that microelectronics industries have already looked very seriously at harnessing ferroelectric materials in a variety of applications, from solid state memory chips (FeRAMs) to field effect transistors (FeFETs). In all such applications, switching the direction of the polarity of the ferroelectric is a key aspect of functional behavior. The mechanism for switching involves the field-induced nucleation and growth of domains. Domain coarsening, through domain wall propagation, eventually causes the entire ferroelectric to switch its polar direction. It is thus the existence and behavior of domains that determine the switching response, and ultimately the performance of the ferroelectric device. A major issue, associated with the integration of ferroelectrics into microelectronic devices, has been that the fundamental properties associated with ferroelectrics, when in bulk form, appear to change quite dramatically and unpredictably when at the nanoscale: new modes of behaviour, and different functional characteristics from those seen in bulk appear. For domains, in particular, the proximity of surfaces and boundaries have a dramatic effect: surface tension and depolarizing fields both serve to increase the equilibrium density of domains, such that minor changes in scale or morphology can have major ramifications for domain redistribution. 
Given the importance of domains in dictating the overall switching characteristics of a device

  8. Censorship in Public Libraries.

    Science.gov (United States)

    Biggins, Barbara; Handsley, Elizabeth

    This paper discusses the legal obligations owed by librarians to the users of their facilities, focusing on the viewing of pornography on the Internet in the public library. The meanings commonly ascribed to the word censorship are presented. Australian federal law that governs the classification of films, videos, computer games, and publications…

  9. Unsharp Values, Domains and Topoi

    CERN Document Server

    Doering, Andreas

    2011-01-01

    The so-called topos approach provides a radical reformulation of quantum theory. Structurally, quantum theory in the topos formulation is very similar to classical physics. There is a state object, analogous to the state space of a classical system, and a quantity-value object, generalising the real numbers. Physical quantities are maps from the state object to the quantity-value object -- hence the `values' of physical quantities are not just real numbers in this formalism. Rather, they are families of real intervals, interpreted as `unsharp values'. We will motivate and explain these aspects of the topos approach and show that the structure of the quantity-value object can be analysed using tools from domain theory, a branch of order theory that originated in theoretical computer science. Moreover, the base category of the topos associated with a quantum system turns out to be a domain if the underlying von Neumann algebra is a matrix algebra. For general algebras, the base category still is a highly struct...

  10. Just how versatile are domains?

    Directory of Open Access Journals (Sweden)

    Bornberg-Bauer Erich

    2008-10-01

    Full Text Available Abstract Background Creating new protein domain arrangements is a frequent mechanism of evolutionary innovation. While some domains always form the same combinations, others form many different arrangements. This ability, which is often referred to as the versatility or promiscuity of domains, fits a random evolutionary model in which a domain's promiscuity is based on its relative frequency. Results We show that there is a clear relationship across genomes between the promiscuity of a given domain and its frequency. However, the strength of this relationship differs for different domains. We thus redefine domain promiscuity by defining a new index, DVI ("domain versatility index"), which eliminates the effect of domain frequency. We explore links between a domain's versatility, when unlinked from abundance, and its biological properties. Conclusion Our results indicate that domains occurring as single-domain proteins and domains appearing frequently at protein termini have a higher DVI. This is consistent with previous observations that the evolution of domain re-arrangements is primarily driven by fusion of pre-existing arrangements and single domains, as well as by loss of domains at protein termini. Furthermore, we studied the link between domain age, defined as the first appearance of a domain in the species tree, and the DVI. Contrary to previous studies based on domain promiscuity, the DVI seems to be age independent. Finally, we find that, contrary to previously reported findings, versatility is lower in Eukaryotes. In summary, our measure of domain versatility indicates that a random attachment process is sufficient to explain the observed distribution of domain arrangements and that several views on domain promiscuity need to be revised.

  11. Public Access Technologies in Public Libraries: Effects and Implications

    Directory of Open Access Journals (Sweden)

    John Carlo Bertot

    2009-06-01

    Full Text Available Public libraries were early adopters of Internet-based technologies and have provided public access to the Internet and computers since the early 1990s. The landscape of public-access Internet and computing was substantially different in the 1990s, as the World Wide Web was only in its initial development. At that time, public libraries essentially experimented with public-access Internet and computer services, largely absorbing this service into existing service and resource provision without substantial consideration of the management, facilities, staffing, and other implications of public-access technology (PAT) services and resources. This article explores the implications of PAT provision for public libraries and reviews issues and practices associated with PAT services and resources. While much research focuses on the amount of public access that public libraries provide, little offers a view of the effect of public access on libraries. This article provides insights into some of the costs, issues, and challenges associated with public access and concludes with recommendations that require continued exploration.

  12. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on sharing experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of the initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes, and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and help new members of the community find the most promising collaboration partners. This talk presents the methodology and initial findings of a survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium: a collaboratively maintained shared resource for the environmental computing community.

  13. Hamilton-Jacobi method for curved domain walls and cosmologies

    Science.gov (United States)

    Skenderis, Kostas; Townsend, Paul K.

    2006-12-01

    We use Hamiltonian methods to study curved domain walls and cosmologies. This leads naturally to first-order equations for all domain walls and cosmologies foliated by slices of maximal symmetry. For Minkowski and AdS-sliced domain walls (flat and closed FLRW cosmologies) we recover a recent result concerning their (pseudo)supersymmetry. We show how domain-wall stability is consistent with the instability of AdS vacua that violate the Breitenlohner-Freedman bound. We also explore the relationship to Hamilton-Jacobi theory and compute the wave-function of a 3-dimensional closed universe evolving towards de Sitter spacetime.

  14. Public Schools

    Data.gov (United States)

    Department of Homeland Security — This Public Schools feature dataset is composed of all public elementary and secondary education in the United States as defined by the Common Core of Data, National...

  15. Computer simulations suggest direct and stable tip to tip interaction between the outer membrane channel TolC and the isolated docking domain of the multidrug RND efflux transporter AcrB.

    Science.gov (United States)

    Schmidt, Thomas H; Raunest, Martin; Fischer, Nadine; Reith, Dirk; Kandt, Christian

    2016-07-01

    One way by which bacteria achieve antibiotic resistance is preventing drug access to its target molecule, for example through overproduction of multi-drug efflux pumps of the resistance-nodulation-division (RND) protein superfamily, of which AcrAB-TolC in Escherichia coli is a prominent example. Although it represents one of the best-studied efflux systems, the question of how AcrB and TolC interact is still unclear, as the available experimental data suggest that either both proteins interact in a tip-to-tip manner or do not interact at all and are instead connected by a hexamer of AcrA molecules. Addressing the question of TolC-AcrB interaction, we performed a series of 100 ns to 1 µs molecular dynamics simulations of membrane-embedded TolC in the presence of the isolated AcrB docking domain (AcrB(DD)). In five of six simulations we observe direct TolC-AcrB(DD) interaction that is only stable on the simulated time scale when both proteins engage in a tip-to-tip manner. At the same time we find TolC opening and closing freely on the extracellular side while remaining closed at the inner periplasmic bottleneck region, suggesting that either the simulated time is too short or additional components are required to unlock TolC.

  16. ProDomAs, protein domain assignment algorithm using center-based clustering and independent dominating set.

    Science.gov (United States)

    Ansari, Elnaz Saberi; Eslahchi, Changiz; Pezeshk, Hamid; Sadeghi, Mehdi

    2014-09-01

    Decomposition of structural domains is an essential task in classifying protein structures, predicting protein function, and many other proteomics problems. As the number of known protein structures in the PDB grows exponentially, the need for accurate automatic domain decomposition methods becomes more essential. In this article, we introduce a bottom-up algorithm for assigning protein domains using a graph theoretical approach. This algorithm is based on a center-based clustering approach. For constructing initial clusters, members of an independent dominating set for the graph representation of a protein are considered as the centers. A distance matrix is then defined for these clusters. To obtain final domains, these clusters are merged using the compactness principle of domains and a method similar to the neighbor-joining algorithm, subject to some thresholds. The thresholds are computed using a training set consisting of 50 protein chains. The algorithm is implemented in C++ and is named ProDomAs. To assess the performance of ProDomAs, its results are compared with those of seven automatic methods on five publicly available benchmarks. The results show that ProDomAs outperforms the other methods on the mentioned benchmarks. The performance of ProDomAs is also evaluated against 6342 chains obtained from ASTRAL SCOP 1.71. ProDomAs is freely available at http://www.bioinf.cs.ipm.ir/software/prodomas. © 2014 Wiley Periodicals, Inc.
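    The role of the independent dominating set can be sketched on a toy graph. A greedily built maximal independent set is automatically an independent dominating set, which is the kind of structure ProDomAs uses to seed its cluster centers; the graph below is a made-up path, not a protein contact graph, and the greedy degree ordering is an illustrative choice.

```python
# Hedged toy sketch: a greedily built maximal independent set is automatically
# an independent dominating set -- every vertex is either chosen or adjacent
# to a chosen vertex, and no two chosen vertices are adjacent.
def independent_dominating_set(adj):
    """adj maps each vertex to the set of its neighbours."""
    chosen, covered = set(), set()
    # visit high-degree vertices first so that few centres cover the graph
    for v in sorted(adj, key=lambda v: len(adj[v]), reverse=True):
        if v not in covered:   # v is neither chosen nor a neighbour of one
            chosen.add(v)
            covered |= {v} | adj[v]
    return chosen

# toy graph: the path 0 - 1 - 2 - 3 - 4
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
centers = independent_dominating_set(adj)
```

The returned set is independent (no two centers are adjacent) and dominating (every vertex is a center or adjacent to one), the two properties the clustering seed requires.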

  17. Axion domain wall baryogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Daido, Ryuji; Kitajima, Naoya [Department of Physics, Tohoku University,Sendai 980-8578 (Japan); Takahashi, Fuminobu [Department of Physics, Tohoku University,Sendai 980-8578 (Japan); Kavli IPMU, TODIAS, University of Tokyo,Kashiwa 277-8583 (Japan)

    2015-07-28

    We propose a new scenario of baryogenesis, in which annihilation of axion domain walls generates a sizable baryon asymmetry. Successful baryogenesis is possible for a wide range of the axion mass and decay constant, m ≃ 10^8–10^13 GeV and f ≃ 10^13–10^16 GeV. Baryonic isocurvature perturbations are significantly suppressed in our model, in contrast to various spontaneous baryogenesis scenarios in the slow-roll regime. In particular, the axion domain wall baryogenesis is consistent with high-scale inflation which generates a large tensor-to-scalar ratio within the reach of future CMB B-mode experiments. We also discuss the gravitational waves produced by the domain wall annihilation and its implications for the future gravitational wave experiments.

  18. Cloud computing and services science

    NARCIS (Netherlands)

    Ivanov, Ivan; Sinderen, van Marten; Shishkov, Boris

    2012-01-01

    This book is essentially a collection of the best papers of the International Conference on Cloud Computing and Services Science (CLOSER), which was held in Noordwijkerhout, The Netherlands on May 7–9, 2011. The conference addressed technology trends in the domain of cloud computing in relation to a

  19. Public lighting.

    NARCIS (Netherlands)

    2011-01-01

    Visual perception is very important for road users and in the dark it can be facilitated by public lighting. Public lighting has a mostly positive road safety effect. Installing public lighting on roads that were previously unlit generally results in fewer and less serious crashes. This effect seems

  20. Time domain Rankine-Green panel method for offshore structures

    Science.gov (United States)

    Li, Zhifu; Ren, Huilong; Liu, Riming; Li, Hui

    2017-02-01

    To solve the numerical divergence problem of the direct time domain Green function method for the motion simulation of floating bodies with large flare, a time domain hybrid Rankine-Green boundary element method is proposed. In this numerical method, the fluid domain is decomposed by an imaginary control surface, at which a continuity condition must be satisfied. The Rankine Green function is adopted in the inner domain, while the transient free surface Green function is applied in the outer domain and is used to find the relationship between the velocity potential and its normal derivative for the inner domain. In addition, the velocity potential at the mean free surface between the body surface and the control surface is solved directly by the integration scheme. The wave exciting force is computed through convolution integration with the wave elevation, by introducing the impulse response function. Additionally, the nonlinear Froude-Krylov force and hydrostatic force, which are computed under the instantaneous incident-wave free surface, are taken into account by the direct pressure integration scheme. The corresponding numerical computer code is developed and first used to compute the hydrodynamic coefficients of a hemisphere, as well as the time history of a ship with large flare; good agreement is obtained with the analytical solutions as well as the available numerical results. The hydrodynamic properties of a FPSO are then studied. The hydrodynamic coefficients agree well with the results computed by the frequency domain method; the influence of the time interval and the truncation time is investigated in detail.
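    The convolution step for the wave exciting force can be sketched numerically. Below, a made-up decaying oscillatory impulse response K and a regular incident wave zeta stand in for the hydrodynamic quantities that the panel code would actually compute; the discrete convolution approximates F(t) = ∫ K(t − τ) ζ(τ) dτ.

```python
import numpy as np

# Hedged numerical sketch of the convolution integral for the wave exciting
# force. K and zeta are illustrative stand-ins, not output of a panel code.
dt = 0.05
t = np.arange(0.0, 40.0, dt)
K = np.exp(-0.5 * t) * np.cos(2.0 * t)      # impulse response (illustrative)
zeta = 0.8 * np.sin(1.2 * t)                # incident wave elevation
# discrete convolution integral: F(t_n) ~ dt * sum_m K(t_n - t_m) zeta(t_m)
F = np.convolve(zeta, K)[: len(t)] * dt
```

After the initial transient, F settles into a sinusoid at the wave frequency whose amplitude is set by the frequency response of K, which is the time-domain counterpart of the frequency-domain exciting-force coefficient.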