WorldWideScience

Sample records for computing hpcwpl final

  1. Energy efficiency of computer power supply units - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Aebischer, B. [cepe - Centre for Energy Policy and Economics, Swiss Federal Institute of Technology Zuerich, Zuerich (Switzerland); Huser, H. [Encontrol GmbH, Niederrohrdorf (Switzerland)

    2002-11-15

    This final report for the Swiss Federal Office of Energy (SFOE) takes a look at the efficiency of computer power supply units, which decreases rapidly during average computer use. The background and the purpose of the project are examined. The power supplies for personal computers are discussed and the testing arrangement used is described. Efficiency, power-factor and operating points of the units are examined. Potentials for improvement and measures to be taken are discussed. Also, action to be taken by those involved in the design and operation of such power units is proposed. Finally, recommendations for further work are made.
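
    The two figures of merit examined in the report relate measured input and output quantities in a straightforward way. Below is a minimal illustrative calculation of efficiency and power factor; the variable names and the sample operating point are assumptions for illustration, not data from the report.

      # Minimal sketch: efficiency and power factor of a PC power supply unit.
      # The sample operating point is an illustrative assumption, not a measurement
      # taken from the report.

      def efficiency(p_out_w: float, p_in_w: float) -> float:
          """DC output power divided by AC input (active) power."""
          return p_out_w / p_in_w

      def power_factor(p_in_w: float, v_rms: float, i_rms: float) -> float:
          """Active input power divided by apparent power (V_rms * I_rms)."""
          return p_in_w / (v_rms * i_rms)

      if __name__ == "__main__":
          # Hypothetical low-load operating point: 60 W of DC load drawn from the mains.
          print(f"efficiency   = {efficiency(60.0, 85.0):.2f}")          # ~0.71
          print(f"power factor = {power_factor(85.0, 230.0, 0.6):.2f}")  # ~0.62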

  2. National Computational Infrastructure for Lattice Gauge Theory: Final Report

    International Nuclear Information System (INIS)

    Richard Brower; Norman Christ; Michael Creutz; Paul Mackenzie; John Negele; Claudio Rebbi; David Richards; Stephen Sharpe; Robert Sugar

    2006-01-01

    This is the final report of the Department of Energy SciDAC Grant "National Computational Infrastructure for Lattice Gauge Theory". It describes the software developed under this grant, which enables the effective use of a wide variety of supercomputers for the study of lattice quantum chromodynamics (lattice QCD). It also describes the research on and development of commodity clusters optimized for the study of QCD. Finally, it provides some highlights of research enabled by the infrastructure created under this grant, as well as a full list of the papers resulting from research that made use of this infrastructure.

  3. Summer 1994 Computational Science Workshop. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This report documents the work performed by the University of New Mexico Principal Investigators and Research Assistants while hosting the highly successful Summer 1994 Computational Sciences Workshop in Albuquerque on August 6--11, 1994. Included in this report is a final budget for the workshop, along with a summary of the participants' evaluation of the workshop. The workshop proceedings have been delivered under separate cover. In order to assist in the organization of future workshops, we have also included in this report detailed documentation of the pre- and post-workshop activities associated with this contract. Specifically, we have included a section that documents the advertising performed, along with the manner in which applications were handled. A complete list of the workshop participants is also included in this section. Sample letters that were generated while dealing with various commercial entities and departments at the University are also included in a section dealing with workshop logistics. Finally, we have included a section in this report that deals with suggestions for future workshops.

  4. SIAM Conference on Geometric Design and Computing. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-03-11

    The SIAM Conference on Geometric Design and Computing attracted 164 domestic and international researchers from academia, industry, and government. It provided a stimulating forum in which to learn about the latest developments, to discuss exciting new research directions, and to forge stronger ties between theory and applications.

  5. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  6. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  7. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  8. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
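
    As context for the cost model above, the sketch below evaluates the weighted cost of the standard signed-digit (NAF) double-and-add recoding, the kind of baseline that optimized addition-subtraction sequences improve upon. The cost weights and the toy exponent are assumptions, and this is not the algorithm proposed in the paper.

      # Minimal sketch (not the paper's algorithm): cost of evaluating an exponentiation
      # with a signed-digit (NAF) addition-subtraction chain, in a setting where a
      # doubling/squaring step is cheaper than a non-doubling (multiply/divide) step.

      def naf(k: int) -> list[int]:
          """Non-adjacent form of k, least-significant digit first, digits in {-1, 0, 1}."""
          digits = []
          while k > 0:
              if k & 1:
                  d = 2 - (k % 4)   # +1 or -1
                  k -= d
              else:
                  d = 0
              digits.append(d)
              k >>= 1
          return digits

      def chain_cost(k: int, c_double: float = 1.0, c_addsub: float = 3.0) -> float:
          """Weighted cost of a left-to-right NAF evaluation of x**k.
          The cost weights are illustrative assumptions."""
          digits = naf(k)
          doublings = len(digits) - 1                 # one squaring per remaining digit
          addsubs = sum(1 for d in digits if d) - 1   # one mul/div per extra nonzero digit
          return doublings * c_double + addsubs * c_addsub

      if __name__ == "__main__":
          # A toy exponent standing in for a final-exponentiation constant.
          print(chain_cost(0x1234567))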

  9. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  10. 75 FR 32803 - Notice of Issuance of Final Determination Concerning a GTX Mobile+ Hand Held Computer

    Science.gov (United States)

    2010-06-09

    ... shall be published in the Federal Register within 60 days of the date the final determination is issued..., involved various scenarios pertaining to the assembly of a desktop computer in the U.S. and the Netherlands... finished desktop computers depending on the model included an additional floppy drive, CD ROM disk, and...

  11. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  12. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  13. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are names and addresses of program authors and contributors in order that users may have continued support of their programs. The BCTIC library list is attached

  14. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust that can help ...

  15. The Magellan Final Report on Cloud Computing

    Energy Technology Data Exchange (ETDEWEB)

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, spanning performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  16. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G' for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G -> G', which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient
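
    Given the decomposition above into mainly-sequential and mainly-parallel kernels, a simple Amdahl's-law bound indicates what the reported kernel speedups can translate into at the application level. The sketch below is a generic textbook calculation, not the project's analysis; the runtime fraction and kernel speedup are assumed values.

      # Generic Amdahl's-law sketch (not the project's analysis): upper bound on overall
      # speedup when only the high-cost kernels are accelerated on a GPU.

      def overall_speedup(parallel_fraction: float, kernel_speedup: float) -> float:
          """parallel_fraction: share of runtime spent in the accelerated kernels (0..1).
          kernel_speedup: speedup of those kernels on the GPU (e.g. the reported 20-32x)."""
          serial_fraction = 1.0 - parallel_fraction
          return 1.0 / (serial_fraction + parallel_fraction / kernel_speedup)

      if __name__ == "__main__":
          # Illustrative assumption: 95% of runtime in accelerated kernels, 25x kernel speedup.
          print(f"{overall_speedup(0.95, 25.0):.1f}x")   # ~11.4x overall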

  17. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  18. The establishment of computer codes for radiological assessment on LLW final disposal in Taiwan

    International Nuclear Information System (INIS)

    Yang, C.C.; Chen, H.T.; Shih, C.L.; Yeh, C.S.; Tsai, C.M.

    1988-01-01

    For final shallow land disposal of Low Level Waste (LLW) in Taiwan, an effort was initiated to establish the evaluation codes for the needs of environmental impact analysis. The objective of the computer codes is to set up generic radiological standards for future evaluation against the 10 CFR Part 61 Licensing Requirements for Land Disposal of Radioactive Wastes. In determining long-term influences resulting from radiological impacts of LLW at disposal sites, there are at least three quantifiable impact measures selected for calculation: dose to members of the public (individual and population), occupational exposures, and costs. The computer codes are from INTRUDE, INVERSI and INVERSW of NUREG-0782, and OPTIONR and GRWATRR of NUREG-0945. They are installed on both the FACOM-M200 and IBM PC/AT systems of the Institute of Nuclear Energy Research (INER). The systematic analysis of the computer codes depends not only on the data bases supported by NUREG/CR-1759 - Data Base for Radioactive Waste Management, Volume 3, Impact Analysis Methodology Report - but also on the information collected from the different exposure scenarios and pathways. A sensitivity study is also performed to assure the long-term stability and security needed for determining the performance objectives

  19. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Corones, James [Krell Institute

    2013-09-23

    High-end computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly, HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington, DC, so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data, address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties of applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  20. Hyperacute stroke patients and catheter thrombolysis therapy. Correlation between computed tomography perfusion maps and final infarction

    International Nuclear Information System (INIS)

    Naito, Yukari; Tanaka, Shigeko; Inoue, Yuichi; Ota, Shinsuke; Sakaki, Saburo; Kitagaki, Hajime

    2008-01-01

    We investigated the correlation between abnormal perfusion areas by computed tomography perfusion (CTP) study of hyperacute stroke patients and the final infarction areas after intraarterial catheter thrombolysis. CTP study using the box-modulation transfer function (box-MTF) method based on the deconvolution analysis method was performed in 22 hyperacute stroke patients. Ischemic lesions were immediately treated with catheter thrombolysis after CTP study. Among them, nine patients with middle cerebral artery (MCA) occlusion were investigated regarding correlations of the size of the prolonged mean transit time (MTT) area, the decreased cerebral blood volume (CBV) area, and the final infarction area. Using the box-MTF method, the prolonged MTT area was almost identical to the final infarction area in the case of catheter thrombolysis failure. The decreased CBV areas resulted in infarction or hemorrhage, irrespective of the outcome of recanalization after catheter thrombolysis. The prolonged MTT areas, detected by the box-MTF method of CTP in hyperacute stroke patients, included the area of true prolonged MTT and the tracer delay. The prolonged MTT area was almost identical to the final infarction area when recanalization failed. We believe that a tracer delay area also indicates infarction in cases of thrombolysis failure. (author)
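
    For orientation, the perfusion parameters discussed above are linked by the central volume principle, MTT = CBV / CBF. The sketch below evaluates that relationship for assumed illustrative values; it does not reproduce the box-MTF deconvolution itself.

      # Minimal sketch of the central volume principle relating the perfusion parameters
      # discussed above (not the box-MTF deconvolution method itself):
      #   MTT [s] = CBV [ml/100 g] / CBF [ml/100 g/min] * 60

      def mean_transit_time_s(cbv_ml_per_100g: float, cbf_ml_per_100g_min: float) -> float:
          return cbv_ml_per_100g / cbf_ml_per_100g_min * 60.0

      # Illustrative (assumed) values: normal tissue vs. hypoperfused tissue.
      print(mean_transit_time_s(4.0, 60.0))   # ~4 s
      print(mean_transit_time_s(3.0, 20.0))   # ~9 s (prolonged MTT)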

  1. Technologies and tools for high-performance distributed computing. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, Nicholas T.

    2000-05-01

    In this project we studied the practical use of the MPI message-passing interface in advanced distributed computing environments. We built on the existing software infrastructure provided by the Globus Toolkit™, the MPICH portable implementation of MPI, and the MPICH-G integration of MPICH with Globus. As a result of this project we have replaced MPICH-G with its successor MPICH-G2, which is also an integration of MPICH with Globus. MPICH-G2 delivers significant improvements in message passing performance when compared to its predecessor MPICH-G, and it was based on superior software design principles, resulting in a software base in which the functional extensions and improvements we made were much easier to implement. Using Globus services we replaced the default implementation of MPI's collective operations in MPICH-G2 with more efficient multilevel topology-aware collective operations which, in turn, led to the development of a new timing methodology for broadcasts [8]. MPICH-G2 was extended to include client/server functionality from the MPI-2 standard [23] to facilitate remote visualization applications and, through the use of MPI idioms, MPICH-G2 provided application-level control of quality-of-service parameters as well as application-level discovery of underlying Grid-topology information. Finally, MPICH-G2 was successfully used in a number of applications including an award-winning record-setting computation in numerical relativity. In the sections that follow we describe in detail the accomplishments of this project, we present experimental results quantifying the performance improvements, and conclude with a discussion of our applications experiences. This project resulted in a significant increase in the utility of MPICH-G2.
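
    MPICH-G2 itself is a C library; as a minimal illustration of the kind of collective operation whose topology-aware implementation is described above, here is a generic MPI broadcast written with mpi4py. The use of mpi4py is purely illustrative and is not a tool from the report.

      # Generic MPI broadcast sketch (mpi4py), illustrating the kind of collective
      # operation whose multilevel, topology-aware implementation MPICH-G2 replaced.
      # mpi4py is used here only for illustration; it is not part of MPICH-G2.
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      data = {"step": 0, "params": [1.0, 2.0]} if rank == 0 else None
      data = comm.bcast(data, root=0)   # every rank receives the root's object
      print(f"rank {rank} received {data}")
      # Run with e.g.: mpiexec -n 4 python bcast_demo.py  (hypothetical file name)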

  2. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  3. Information-preserving models of physics and computation: Final report

    International Nuclear Information System (INIS)

    1986-01-01

    This research pertains to discrete dynamical systems, as embodied by cellular automata, reversible finite-difference equations, and reversible computation. The research has strengthened the cross-fertilization between physics, computer science and discrete mathematics. It has shown that methods and concepts of physics can be exported to computation. Conversely, fully discrete dynamical systems have been shown to be fruitful for representing physical phenomena usually described with differential equations - cellular automata for fluid dynamics has been the most noted example of such a representation. At the practical level, the fully discrete representation approach suggests innovative uses of computers for scientific computing. The originality of these uses lies in their non-numerical nature: they avoid the inaccuracies of floating-point arithmetic and bypass the need for numerical analysis. 38 refs
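
    As a minimal illustration of the information-preserving (reversible) discrete dynamics described above, the sketch below uses the standard second-order construction s_next = f(s_now) XOR s_prev, which is invertible for any local rule f; the particular rule chosen is an assumption.

      # Minimal sketch of an information-preserving (reversible) cellular automaton,
      # using the standard second-order construction: s_next = f(s_now) XOR s_prev.
      # Because XOR is invertible, s_prev = f(s_now) XOR s_next, so no information is lost.
      import random

      def f(state: list[int]) -> list[int]:
          """An arbitrary local rule (assumed for illustration): XOR of the two neighbors."""
          n = len(state)
          return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

      def step(prev: list[int], now: list[int]) -> list[int]:
          """Second-order update: s_next = f(s_now) XOR s_prev."""
          return [a ^ b for a, b in zip(f(now), prev)]

      n = 16
      s0 = [random.randint(0, 1) for _ in range(n)]
      s1 = [random.randint(0, 1) for _ in range(n)]

      history = [s0, s1]
      for _ in range(10):                       # run forward 10 steps
          history.append(step(history[-2], history[-1]))

      later, earlier = history[-1], history[-2]
      for _ in range(10):                       # run the inverse dynamics 10 steps
          later, earlier = earlier, step(later, earlier)

      assert (later, earlier) == (s1, s0)       # the initial state is recovered exactly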

  4. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report; Fortschrittliche Rechenmethoden zum Kernverhalten bei Reaktivitaetsstoerfaellen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Pautz, A.; Perin, Y.; Pasichnyk, I.; Velkov, K.; Zwermann, W.; Seubert, A.; Klein, M.; Gallner, L.; Krzycacz-Hausmann, B.

    2012-05-15

    The document at hand serves as the final report for the reactor safety research project RS1183 ''Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents''. The work performed in the framework of this project was dedicated to the development, validation and application of advanced computational methods for the simulation of transients and accidents of nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/ thermal-hydraulics full core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain shall primarily be deployed for light water reactors (LWR), but should beyond that also be applicable for innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data, as well as due to methodological aspects.

  5. Quantum computing accelerator I/O : LDRD 52750 final report

    International Nuclear Information System (INIS)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-01-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting and cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with the surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by understanding of practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work.
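
    As a minimal numerical illustration of the superposition of '0' and '1' described above (generic linear algebra, not Sandia work or hardware), a Hadamard gate puts a single qubit into an equal superposition:

      # Minimal sketch of the superposition described above: a single qubit as a
      # 2-component complex vector, put into an equal superposition of |0> and |1>
      # by a Hadamard gate. Pure NumPy; no quantum hardware or Sandia code implied.
      import numpy as np

      ket0 = np.array([1.0, 0.0], dtype=complex)            # |0>
      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

      psi = H @ ket0                                         # (|0> + |1>) / sqrt(2)
      probabilities = np.abs(psi) ** 2                       # 50% / 50% on measurement
      print(psi, probabilities)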

  6. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project..., documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  7. Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.

    Science.gov (United States)

    Parkland Coll., Champaign, IL.

    A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…

  8. Application of personal computers to enhance operation and management of research reactors. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    1998-02-01

    The on-line use of personal computers (PCs) can be valuable in guiding the research reactor operator in analysing both normal and abnormal situations. PCs can effectively be used for data acquisition and data processing, and for providing information to the operator. Typical areas of on-line applications of PCs in nuclear research reactors include: acquisition and display of data on process parameters; performance evaluation of major equipment and safety related components; fuel management; computation of reactor physics parameters; failed fuel detection and location; inventory of system fluids; training using computer aided simulation; operator advice. All these applications require the development of computer programmes and interface hardware. In recognizing this need, the IAEA initiated in 1990 a Co-ordinated Research Programme (CRP) on "Application of Personal Computers to Enhance Operation and Management of Research Reactors". The final meeting of the CRP was held from 30 October to 3 November 1995 in Dalat, Viet Nam. This report was written by contributors from Bangladesh, Germany, India, the Republic of Korea, Pakistan, Philippines, Thailand and Viet Nam. The IAEA staff members responsible for the publication were K. Akhtar and V. Dimic of the Physics Section, Division of Physical and Chemical Sciences

  9. UOP FIN 571 Final Exam Guide New

    OpenAIRE

    ADMIN

    2018-01-01

    UOP FIN 571 Final Exam Guide New Check this A+ tutorial guideline at http://www.fin571assignment.com/fin-571-uop/fin-571-final-exam-guide-latest For more classes visit http://www.fin571assignment.com Question 1 The underlying assumption of the dividend growth model is that a stock is worth: A. An amount computed as the next annual dividend divided by the required rate of return. B. An amount computed as the next annual dividend divided by the ma...

  10. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  11. Active system area networks for data intensive computations. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-04-01

    The goal of the Active System Area Networks (ASAN) project is to develop hardware and software technologies for the implementation of active system area networks (ASANs). The use of the term "active" refers to the ability of the network interfaces to perform application-specific as well as system level computations in addition to their traditional role of data transfer. This project adopts the view that the network infrastructure should be an active computational entity capable of supporting certain classes of computations that would otherwise be performed on the host CPUs. The result is a unique network-wide programming model where computations are dynamically placed within the host CPUs or the NIs depending upon the quality of service demands and network/CPU resource availability. The project seeks to demonstrate that such an approach is a better match for data-intensive network-based applications and that the advent of low-cost powerful embedded processors and configurable hardware makes such an approach economically viable and desirable.

  12. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  13. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  14. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  15. Does Preinterventional Flat-Panel Computer Tomography Pooled Blood Volume Mapping Predict Final Infarct Volume After Mechanical Thrombectomy in Acute Cerebral Artery Occlusion?

    International Nuclear Information System (INIS)

    Wagner, Marlies; Kyriakou, Yiannis; Mesnil de Rochemont, Richard du; Singer, Oliver C.; Berkefeld, Joachim

    2013-01-01

    Purpose: Decreased cerebral blood volume is known to be a predictor for final infarct volume in acute cerebral artery occlusion. To evaluate the predictability of final infarct volume in patients with acute occlusion of the middle cerebral artery (MCA) or the distal internal carotid artery (ICA) and successful endovascular recanalization, pooled blood volume (PBV) was measured using flat-panel detector computed tomography (FPD CT). Materials and Methods: Twenty patients with acute unilateral occlusion of the MCA or distal ICA without demarcated infarction, as proven by CT at admission, and successful endovascular thrombectomy (Thrombolysis in Cerebral Infarction score TICI 2b or 3) were included. Cerebral PBV maps were acquired from each patient immediately before endovascular thrombectomy. Twenty-four hours after recanalization, each patient underwent multislice CT to visualize the final infarct volume. The extent of the areas of decreased PBV was compared with the final infarct volume proven by follow-up CT the next day. Results: In 15 of 20 patients, areas of distinct PBV decrease corresponded to the final infarct volume. In 5 patients, areas of decreased PBV overestimated the final extension of ischemia, probably due to inappropriate timing of data acquisition and misery perfusion. Conclusion: PBV mapping using FPD CT is a promising tool to predict areas of irrecoverable brain parenchyma in acute thromboembolic stroke. Further validation is necessary before routine use for decision making for interventional thrombectomy.

  16. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    [Search-snippet residue from the report documentation page; recoverable fragments: the report contrasts two paradigms - few sensors with complex computations versus many sensors with simple computation - and includes a section on challenges with nano-enabled neuromorphic chips. Foundations of Neuromorphic Computing, final technical report, May 2013, covering an in-house effort from 2009 to September 2012; approved for public release.]

  17. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
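
    As a toy illustration of the MapReduce model compared in the paper, the sketch below runs a word count as a map phase over input splits followed by a reduce phase that merges partial counts. A multiprocessing pool stands in (as an assumption) for the distributed workers of a real MapReduce cluster.

      # Toy MapReduce-style word count: map over input splits, then reduce by merging
      # partial counts. Illustrative only; real MapReduce runs on a distributed cluster.
      from collections import Counter
      from functools import reduce
      from multiprocessing import Pool

      def map_phase(chunk: str) -> Counter:
          """map: emit (word, count) pairs for one input split."""
          return Counter(chunk.split())

      def reduce_phase(a: Counter, b: Counter) -> Counter:
          """reduce: merge partial counts by key."""
          return a + b

      if __name__ == "__main__":
          splits = ["cloud computing", "parallel computing", "cloud cloud"]
          with Pool() as pool:
              partials = pool.map(map_phase, splits)
          totals = reduce(reduce_phase, partials, Counter())
          print(totals)   # Counter({'cloud': 3, 'computing': 2, 'parallel': 1})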

  18. Computational infrastructure for law enforcement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lades, M.; Kunz, C.; Strikos, I.

    1997-02-01

    This project planned to demonstrate the leverage of enhanced computational infrastructure for law enforcement by demonstrating the face recognition capability at LLNL. The project implemented a face finder module, extending the segmentation capabilities of the current face recognition system so that it was capable of processing different image formats and sizes, and created the pilot of a network-accessible image database for the demonstration of face recognition capabilities. The project was funded at $40k (2 man-months) for a feasibility study. It investigated several essential components of a networked face recognition system which could help identify, apprehend, and convict criminals.

  19. Calculation of free-energy differences from computer simulations of initial and final states

    International Nuclear Information System (INIS)

    Hummer, G.; Szabo, A.

    1996-01-01

    A class of simple expressions of increasing accuracy for the free-energy difference between two states is derived based on numerical thermodynamic integration. The implementation of these formulas requires simulations of the initial and final (and possibly a few intermediate) states. They involve higher free-energy derivatives at these states which are related to the moments of the probability distribution of the perturbation. Given a specified number of such derivatives, these integration formulas are optimal in the sense that they are exact to the highest possible order of free-energy perturbation theory. The utility of this approach is illustrated for the hydration free energy of water. This problem provides a quite stringent test because the free energy is a highly nonlinear function of the charge so that even fourth order perturbation theory gives a very poor estimate of the free-energy change. Our results should prove most useful for complex, computationally demanding problems where free-energy differences arise primarily from changes in the electrostatic interactions (e.g., electron transfer, charging of ions, protonation of amino acids in proteins). copyright 1996 American Institute of Physics
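
    For orientation, the sketch below evaluates the two textbook end-state estimators that the paper's higher-order formulas refine: the exponential (Zwanzig) free-energy perturbation average and the lowest-order two-state mean of the perturbation energy. The synthetic Gaussian samples and the value of kT are assumptions, and these are not the authors' optimal expressions.

      # Textbook end-state estimators that the paper's higher-order formulas refine
      # (generic sketch, not the authors' optimal expressions).
      import numpy as np

      def fep_zwanzig(delta_u_0: np.ndarray, kT: float) -> float:
          """Exponential (Zwanzig) estimate from samples of Delta U in the initial state:
          Delta F = -kT * ln < exp(-Delta U / kT) >_0"""
          return -kT * np.log(np.mean(np.exp(-delta_u_0 / kT)))

      def two_state_mean(delta_u_0: np.ndarray, delta_u_1: np.ndarray) -> float:
          """Lowest-order two-state estimate: average of the mean perturbation
          energies in the initial (0) and final (1) states."""
          return 0.5 * (np.mean(delta_u_0) + np.mean(delta_u_1))

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          kT = 2.5  # roughly kJ/mol at ~300 K (assumed units)
          # Synthetic Gaussian samples of the perturbation energy (illustrative only).
          du0 = rng.normal(10.0, 2.0, 50_000)   # Delta U sampled in state 0
          du1 = rng.normal(8.0, 2.0, 50_000)    # Delta U sampled in state 1
          print(fep_zwanzig(du0, kT), two_state_mean(du0, du1))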

  20. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives to human computer interaction in mobile and ubiquitous computing. The review covers overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics on these field, including real-world oriented interface, multi-modal interface, context awareness and in-visible computers. Finally we discuss killer applications for coming ubiquitous computing era.

  1. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A. [and others

    1995-12-31

    In conformity with the protocol of the Workshop under Contract {open_quotes}Assessment of RBMK reactor safety using modern Western Codes{close_quotes} VNIIEF performed a neutronics computation series to compare western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS, EKRAN codes (improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo). Cell, polycell, burnup computation; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA). The computations were performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both being developed in collaboration with NIKIET. Formulation of the first problem maximally possibly agrees with one of NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system controls (CPS) movement in a core.

  2. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.
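
    As a minimal illustration of threshold-based chaos computing of the kind summarized above, the sketch below encodes two logic inputs in the initial condition of the logistic map, applies one chaotic iteration, and thresholds the result. The particular parameter choices, which happen to realize AND and XOR, are illustrative assumptions rather than values taken from the paper.

      # Minimal sketch of threshold-based chaos computing with the logistic map
      # f(x) = 4x(1-x): logic inputs shift the initial condition, one chaotic iteration
      # is applied, and a threshold reads out the logic output. Parameters are
      # illustrative assumptions that happen to realize AND and XOR.

      def logistic(x: float) -> float:
          return 4.0 * x * (1.0 - x)

      def chaotic_gate(in1: int, in2: int, x_star: float, delta: float, threshold: float) -> int:
          x0 = x_star + (in1 + in2) * delta   # encode the two logic inputs
          return 1 if logistic(x0) > threshold else 0

      AND = dict(x_star=0.0, delta=0.25, threshold=0.75)
      XOR = dict(x_star=0.25, delta=0.25, threshold=0.875)

      for a in (0, 1):
          for b in (0, 1):
              print(a, b, chaotic_gate(a, b, **AND), chaotic_gate(a, b, **XOR))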

  3. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Ding, Yongjian; Krause, Ulrich; Gu, Chunlei

    2014-01-01

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communication and whose functionalities are freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired technology lie, for example, in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C technology a higher potential for common cause failures (CCF) and for easier manipulation by sabotage (IT security). In this joint research project, funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall thereby be extended to this area, providing a scientific contribution to sound reliability judgments of digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C systems will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method called multi-class binomial failure rate (MCFBR), which was successfully used in safety valve applications, will be
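
    To make the common-cause-failure terminology concrete, the sketch below evaluates a generic binomial-failure-rate style model for a redundant channel group. It is a textbook-style illustration rather than the MCFBR method developed in the project, and all rates and probabilities are assumed values.

      # Generic binomial failure rate (BFR) sketch for common cause failures (CCF) in an
      # m-redundant I&C channel group. Textbook-style illustration, not the project's
      # MCFBR method; all rates are assumptions.
      from math import comb

      def ccf_group_rate(m: int, k: int, shock_rate: float, p: float) -> float:
          """Rate of CCF shocks that fail exactly k out of m redundant channels.
          shock_rate: rate of common cause shocks [1/h]; p: conditional failure
          probability of each channel given a shock."""
          return shock_rate * comb(m, k) * p**k * (1 - p) ** (m - k)

      def loss_of_function_rate(m: int, success_needed: int, shock_rate: float, p: float) -> float:
          """Rate of shocks that leave fewer than `success_needed` channels working."""
          max_tolerable_failures = m - success_needed
          return sum(ccf_group_rate(m, k, shock_rate, p)
                     for k in range(max_tolerable_failures + 1, m + 1))

      if __name__ == "__main__":
          # Illustrative assumptions: 4 redundant channels, 2-out-of-4 logic,
          # shocks at 1e-3 per hour, conditional failure probability 0.1 per channel.
          print(f"{loss_of_function_rate(4, 2, 1e-3, 0.1):.2e} per hour")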

  4. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  5. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT
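
    As a minimal illustration of the forward uncertainty propagation mentioned above, the sketch below pushes sampled input uncertainties through a stand-in model with plain Monte Carlo. The model and the input distributions are assumptions, and this is generic code, not the MIT QUEST software.

      # Minimal sketch of forward uncertainty propagation by Monte Carlo sampling
      # (generic illustration; the model and input distributions are assumptions).
      import numpy as np

      def model(x: np.ndarray) -> np.ndarray:
          """A stand-in computational model y = f(x1, x2)."""
          return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

      rng = np.random.default_rng(42)
      n_samples = 100_000
      # Uncertain inputs: x1 ~ N(0, 0.2), x2 ~ U(0.8, 1.2)
      x = np.column_stack([rng.normal(0.0, 0.2, n_samples),
                           rng.uniform(0.8, 1.2, n_samples)])
      y = model(x)
      print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")   # propagated output uncertainty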

  6. Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents

    Science.gov (United States)

    2016-07-27

    Brain Computer Interfaces (BCIs) show great potential in allowing humans to interact with computational environments in a... [Remainder is search-snippet residue from the report documentation page: Final Report: Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents, reporting period 17-Sep-2013 to 16-Sep-2014, report dated 27-07-2016; distribution unlimited.]

  7. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions shall be answered that were asked by computer-interested radiologists during the system presentation. On the one hand, there still exists a prejudice against standard-text programmes, and on the other, undefined fears that handling a computer is too difficult and that one has to learn a computer language first to be able to work with computers. Finally, it is pointed out which real competitive advantages can be obtained through computer usage. (orig.) [de]

  8. National Computational Infrastructure for Lattice Gauge Theory: Final report

    International Nuclear Information System (INIS)

    Reed, Daniel A.

    2008-01-01

    In this document we describe work done under the SciDAC-1 Project National Computational Infrastructure for Lattice Gauge Theory. The objective of this project was to construct the computational infrastructure needed to study quantum chromodynamics (QCD). Nearly all high energy and nuclear physicists in the United States working on the numerical study of QCD are involved in the project, as are Brookhaven National Laboratory (BNL), Fermi National Accelerator Laboratory (FNAL), and Thomas Jefferson National Accelerator Facility (JLab). A list of the senior participants is given in Appendix A.2. The project includes the development of community software for the effective use of the terascale computers, and the research and development of commodity clusters optimized for the study of QCD. The software developed as part of this effort is publicly available, and is being widely used by physicists in the United States and abroad. The prototype clusters built with SciDAC-1 funds have been used to test the software, and are available to lattice gauge theorists in the United States on a peer-reviewed basis.

  9. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users have become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other rich systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
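
    A minimal sketch of the offloading decision such frameworks make is shown below: offload when transfer time plus remote execution time beats local execution time. The device, cloud, and network figures are assumed values, and this is a generic model rather than any surveyed framework.

      # Generic computation-offloading decision sketch: offload when remote execution
      # plus data transfer is faster than local execution. All figures are assumptions.

      def local_time_s(cycles: float, f_local_hz: float) -> float:
          return cycles / f_local_hz

      def offload_time_s(cycles: float, f_cloud_hz: float, data_bits: float, bandwidth_bps: float) -> float:
          return data_bits / bandwidth_bps + cycles / f_cloud_hz

      def should_offload(cycles: float, data_bits: float,
                         f_local_hz: float = 1e9, f_cloud_hz: float = 10e9,
                         bandwidth_bps: float = 5e6) -> bool:
          return offload_time_s(cycles, f_cloud_hz, data_bits, bandwidth_bps) < local_time_s(cycles, f_local_hz)

      if __name__ == "__main__":
          # Heavy computation, small state to ship: offloading wins.
          print(should_offload(cycles=5e9, data_bits=8e6))   # True
          # Light computation, large state to ship: run it locally.
          print(should_offload(cycles=1e8, data_bits=8e7))   # False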

  10. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. It also introduces the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  11. Peer-to-peer architectures for exascale computing : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Vorobeychik, Yevgeniy; Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Donald W.

    2010-09-01

    The goal of this research was to investigate the potential for employing dynamic, decentralized software architectures to achieve reliability in future high-performance computing platforms. These architectures, inspired by peer-to-peer networks such as botnets that already scale to millions of unreliable nodes, hold promise for enabling scientific applications to run usefully on next-generation exascale platforms (~10^18 operations per second). Traditional parallel programming techniques suffer rapid deterioration of performance scaling with growing platform size, as the work of coping with increasingly frequent failures dominates over useful computation. Our studies suggest that new architectures, in which failures are treated as ubiquitous and their effects are considered as simply another controllable source of error in a scientific computation, can remove such obstacles to exascale computing for certain applications. We have developed a simulation framework, as well as a preliminary implementation in a large-scale emulation environment, for exploration of these 'fault-oblivious computing' approaches. High-performance computing (HPC) faces a fundamental problem of increasing total component failure rates due to increasing system sizes, which threaten to degrade system reliability to an unusable level by the time the exascale range is reached (~10^18 operations per second, requiring of order millions of processors). As computer scientists seek a way to scale system software for next-generation exascale machines, it is worth considering peer-to-peer (P2P) architectures that are already capable of supporting 10^6-10^7 unreliable nodes. Exascale platforms will require a different way of looking at systems and software because the machine will likely not be available in its entirety for a meaningful execution time. Realistic estimates of failure rates range from a few times per day to more than once per hour for these
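
    The 'fault-oblivious' idea described above, in which node failures are treated as just another source of statistical error, can be illustrated with a toy Monte Carlo estimate where the contributions of failed workers are simply dropped. This is a minimal sketch under an assumed failure rate, not the project's simulation framework.

```python
import random

def estimate_pi(samples_per_worker, n_workers, p_fail=0.2, seed=1):
    """Monte Carlo estimate of pi that tolerates worker loss.

    Each 'worker' may fail with probability p_fail; its samples are then
    simply missing, widening the statistical error but not aborting the run.
    """
    rng = random.Random(seed)
    hits = total = 0
    for _ in range(n_workers):
        if rng.random() < p_fail:          # worker lost: ignore it and move on
            continue
        for _ in range(samples_per_worker):
            x, y = rng.random(), rng.random()
            hits += (x * x + y * y) <= 1.0
        total += samples_per_worker
    return 4.0 * hits / total if total else float("nan")

print(estimate_pi(10_000, 50))   # still close to pi despite ~20% worker loss
```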

  12. Learning with Ubiquitous Computing

    Science.gov (United States)

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  13. Application of Computer Graphics to Graphing in Algebra and Trigonometry. Final Report.

    Science.gov (United States)

    Morris, J. Richard

    This project was designed to improve the graphing competency of students in elementary algebra, intermediate algebra, and trigonometry courses at Virginia Commonwealth University. Computer graphics programs were designed using an Apple II Plus computer and implemented using Pascal. The software package is interactive and gives students control…

  14. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray

  15. Computer simulation of kinetic properties of plasmas. Final report

    International Nuclear Information System (INIS)

    Denavit, J.

    1982-08-01

    The research was directed toward the development and testing of new numerical methods for particle and hybrid simulation of plasmas, and their application to physical problems of current significance to Magnetic Fusion Energy. This project will terminate on August 31, 1982 and this Final Report describes: (1) the research accomplished since the last renewal on October 1, 1981; and (2) a perspective of the work done since the beginning of the project in February 1972

  16. Development of process control capability through the Browns Ferry Integrated Computer System using Reactor Water Cleanup System as an example. Final report

    International Nuclear Information System (INIS)

    Smith, J.; Mowrey, J.

    1995-12-01

    This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are also included in this report. The testing and acceptance program and results are also detailed in this report. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants

  17. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from the predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.

  18. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  19. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new systems concepts made possible by the hardware development are explained. (orig./AK) [de

  20. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  1. Final Report: Performance Engineering Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    This document is a final report about the work performed for cooperative agreement DE-FC02-06ER25764, the Rice University effort of Performance Engineering Research Institute (PERI). PERI was an Enabling Technologies Institute of the Scientific Discovery through Advanced Computing (SciDAC-2) program supported by the Department of Energy's Office of Science Advanced Scientific Computing Research (ASCR) program. The PERI effort at Rice University focused on (1) research and development of tools for measurement and analysis of application program performance, and (2) engagement with SciDAC-2 application teams.

  2. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bachrach, Harrison Ian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carlson, Nils [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Collier, Angela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dumas, William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fankell, Douglas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferris, Natalie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gonzalez, Francisco [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Griffith, Alec [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Guston, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kenyon, Connor [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Li, Benson [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mookerjee, Adaleena [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parkinson, Christian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peck, Hailee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peters, Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Poondla, Yasvanth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rogers, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shaffer, Nathaniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Trettel, Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valaitis, Sonata Mae [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Venzke, Joel Aaron [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Black, Mason [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Demircan, Samet [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Holladay, Robert Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-22

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it.

  3. Final Report. Center for Scalable Application Development Software

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  4. DOE Utility Matching Program Final Technical Report

    International Nuclear Information System (INIS)

    Haghighat, Alireza

    2002-01-01

    This is the Final report for the DOE Match Grant (DE-FG02-99NE38163) awarded to the Nuclear and Radiological Engineering (NRE) Department, University of Florida, for the period of September 1999 to January 2002. This grant has been instrumental for maintaining high-quality graduate and undergraduate education at the NRE department. The grant has been used for supporting student entry and retention and for upgrading nuclear educational facilities, nuclear instrumentation, computer facilities, and computer codes to better enable the incorporation of experimental experiences and computer simulations related to advanced light water fission reactor engineering and other advanced reactor concepts into the nuclear engineering course curricula

  5. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; this is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  6. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; this is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced

  7. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Caldwell, Wendy [Arizona State Univ., Mesa, AZ (United States); Brown, Barton Jed [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pederson, Clark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Justin [Univ. of California, Santa Cruz, CA (United States); Burrill, Daniel [Univ. of Vermont, Burlington, VT (United States); Feinblum, David [Univ. of California, Irvine, CA (United States); Hyde, David [SLAC National Accelerator Lab., Menlo Park, CA (United States). Stanford Institute for Materials and Energy Science (SIMES); Levick, Nathan [Univ. of New Mexico, Albuquerque, NM (United States); Lyngaas, Isaac [Florida State Univ., Tallahassee, FL (United States); Maeng, Brad [Univ. of Michigan, Ann Arbor, MI (United States); Reed, Richard LeRoy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarno-Smith, Lois [Univ. of Michigan, Ann Arbor, MI (United States); Shohet, Gil [Univ. of Illinois, Urbana-Champaign, IL (United States); Skarda, Jinhie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stevens, Josey [Missouri Univ. of Science and Technology, Rolla, MO (United States); Zeppetello, Lucas [Columbia Univ., New York, NY (United States); Grossman-Ponemon, Benjamin [Stanford Univ., CA (United States); Bottini, Joseph Larkin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Loudon, Tyson Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); VanGessel, Francis Gilbert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagaraj, Sriram [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Price, Jacob [Univ. of Washington, Seattle, WA (United States)

    2015-10-15

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high-temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it. This report includes both the background for the program and the reports from the students.

  8. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulation on its computational and functional roles rather than on anatomical or chemical criteria. We review the main frameworks in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.
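
    One common way modeling studies implement neuromodulation, as the review describes, is as a multiplicative change in a neuron's gain or in a synaptic weight. The fragment below is a minimal rate-model sketch of gain modulation; the sigmoid form and all numbers are illustrative assumptions, not a model from the review.

```python
import numpy as np

def firing_rate(inputs, weights, gain=1.0, threshold=0.5, r_max=100.0):
    """Sigmoidal rate neuron; 'gain' stands in for a neuromodulator level."""
    drive = np.dot(weights, inputs)
    return r_max / (1.0 + np.exp(-gain * (drive - threshold)))

inputs = np.array([0.2, 0.4, 0.1])
weights = np.array([1.0, 0.5, 2.0])

# e.g. low, baseline, and high modulator concentration
for gain in (0.5, 1.0, 2.0):
    print(f"gain={gain}: rate={firing_rate(inputs, weights, gain=gain):.1f} Hz")
```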

  9. CAD-centric Computation Management System for a Virtual TBM. Final Report

    International Nuclear Information System (INIS)

    Munipalli, Ramakanth; Szema, K.Y.; Huang, P.Y.; Rowell, C.M.; Ying, A.; Abdou, M.

    2011-01-01

    HyPerComp Inc., in research collaboration with TEXCEL, has set out to build a Virtual Test Blanket Module (VTBM) computational system to address the need in contemporary fusion research for simulating the integrated behavior of the blanket, divertor and plasma-facing components in a fusion environment. Physical phenomena to be considered in a VTBM will include fluid flow, heat transfer, mass transfer, neutronics, structural mechanics and electromagnetics. We seek to integrate well-established (third-party) simulation software in the various disciplines mentioned above. The integrated modeling process will enable user groups to interoperate using a common modeling platform at various stages of the analysis. Since CAD is at the core of the simulation (as opposed to computational meshes, which are different for each problem), VTBM will have a well-developed CAD interface governing CAD model editing, cleanup, parameter extraction, model deformation (based on simulation), and CAD-based data interpolation. In Phase-I, we built the CAD-hub of the proposed VTBM and demonstrated its use in modeling a liquid breeder blanket module with coupled MHD and structural mechanics using HIMAG and ANSYS. A complete graphical user interface of the VTBM was created, which will form the foundation of any future development. Conservative data interpolation via CAD (as opposed to mesh-based transfer) and the regeneration of CAD models based upon computed deflections are among the other highlights of Phase-I activity.

  10. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    Full Text Available I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.
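
    For the boson-sampling model mentioned above, the probability of a collision-free output pattern is proportional to the squared modulus of a permanent of a submatrix of the interferometer unitary. The sketch below is a small numerical illustration (random unitary, naive permanent); the mode numbers and pattern are arbitrary and it does not follow any particular implementation from the review.

```python
import itertools
import numpy as np

def random_unitary(m, seed=0):
    """Random unitary via QR decomposition of a complex Gaussian matrix."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases

def permanent(a):
    """Naive O(n!) permanent, fine for tiny matrices."""
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

m = 6                       # number of optical modes
inputs = [0, 1, 2]          # one photon in each of these input modes
outputs = [1, 3, 5]         # a collision-free output pattern
U = random_unitary(m)
sub = U[np.ix_(outputs, inputs)]          # submatrix U_{T,S}
print("P(pattern) =", abs(permanent(sub)) ** 2)
```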

  11. Improved Barriers to Turbine Engine Fragments: Final Annual Report

    National Research Council Canada - National Science Library

    Shockey, Donald

    2002-01-01

    This final annual technical report describes the progress made during year 4 of the SRI International Phase II effort to develop a computational capability for designing lightweight fragment barriers...

  12. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 3: Comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    1996-05-01

    This publication contains the final papers summarizing the validation of the codes on the basis of comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Refs, figs tabs

  13. AIMES Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA); Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States); Weissman, Jon [Univ. of Minnesota, Minneapolis, MN (United States); Turilli, Matteo [Rutgers Univ., New Brunswick, NJ (United States)

    2017-01-31

    This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented into software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (Bundles), derives a suitable execution strategy for the given skeleton and enacts its execution by means of pilots on one or more resources, depending on the application requirements, and resource availabilities and capabilities.

  14. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  15. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate

  16. A primer on the energy efficiency of computing

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan G. [Research Fellow, Steyer-Taylor Center for Energy Policy and Finance, Stanford University (United States)

    2015-03-30

    The efficiency of computing at peak output has increased rapidly since the dawn of the computer age. This paper summarizes some of the key factors affecting the efficiency of computing in all usage modes. While there is still great potential for improving the efficiency of computing devices, we will need to alter how we do computing in the next few decades because we are finally approaching the limits of current technologies.
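
    The long-run trend the paper summarizes is often expressed as a doubling of computations per kilowatt-hour every fixed period. The arithmetic below is an illustrative extrapolation of that kind; the ~1.6-year doubling period is an assumed round figure, not a value taken from this paper.

```python
def efficiency_gain(years, doubling_period_years=1.6):
    """Multiplicative gain in computations per kWh after `years` years,
    assuming a constant doubling period (an illustrative assumption)."""
    return 2 ** (years / doubling_period_years)

for years in (5, 10, 20):
    print(f"{years:>2} years -> ~{efficiency_gain(years):,.0f}x computations per kWh")
```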

  17. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  18. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated, mainly from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized.
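
    As an illustration of the kind of self-service provisioning such an OpenStack platform enables, the sketch below uses the openstacksdk client to boot a compute instance. The cloud entry, image, flavor, and network names are placeholders and are not the CSNS configuration.

```python
import openstack

# 'csns-cloud' is a placeholder entry assumed to exist in clouds.yaml
conn = openstack.connect(cloud="csns-cloud")

image = conn.compute.find_image("CentOS-7")          # placeholder image name
flavor = conn.compute.find_flavor("m1.large")        # placeholder flavor name
network = conn.network.find_network("physics-net")   # placeholder network name

# Boot a worker instance and wait until it becomes ACTIVE
server = conn.compute.create_server(
    name="csns-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```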

  19. Videoprocessing with the MSX-computer

    International Nuclear Information System (INIS)

    Vliet, G.J. van.

    1988-01-01

    This report deals with the processing of video images with a Philips MSX-2 computer and is directed specifically at the processing of the video signals of the beam viewers. The final purpose is to create an extra control function which may be used for tuning the beam. This control function is established by mixing the video signals with a reference image from the computer. 7 figs

  20. WAMCUT, a computer code for fault tree evaluation. Final report

    International Nuclear Information System (INIS)

    Erdmann, R.C.

    1978-06-01

    WAMCUT is a code in the WAM family which produces the minimum cut sets (MCS) for a given fault tree. The MCS are useful as they provide a qualitative evaluation of a system, as well as providing a means of determining the probability distribution function for the top of the tree. The program is very efficient and will produce all the MCS in a very short computer time span. 22 figures, 4 tables
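
    A minimal cut set computation of the kind WAMCUT performs can be sketched for a toy fault tree: OR gates union their children's cut sets, AND gates combine them, and non-minimal sets are discarded; the rare-event approximation then sums the cut-set probabilities to estimate the top event. The tree and probabilities below are invented for illustration and are not WAMCUT's algorithm.

```python
import math
from itertools import product

def cut_sets(node):
    """Minimal cut sets of a small AND/OR fault tree (tuples = gates, strings = basic events)."""
    if isinstance(node, str):                       # basic event
        return [frozenset([node])]
    gate, children = node[0], node[1:]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":
        sets = [s for cs in child_sets for s in cs]
    else:                                           # "AND": cross-combine children
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    return [s for s in sets if not any(o < s for o in sets)]   # drop non-minimal sets

# Toy tree: TOP = (A AND B) OR (A AND C) OR D  -- invented for illustration
tree = ("OR", ("AND", "A", "B"), ("AND", "A", "C"), "D")
prob = {"A": 1e-2, "B": 2e-3, "C": 5e-3, "D": 1e-4}

mcs = cut_sets(tree)
p_top = sum(math.prod(prob[e] for e in s) for s in mcs)        # rare-event approximation
print([sorted(s) for s in mcs], f"P(top) ~ {p_top:.2e}")
```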

  1. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

    A general methodology for Computer Aided Product Design (CAPD) with specified property constraints which is capable of solving a large range of problems is presented. The methodology employs the group contribution approach, generates acyclic, cyclic and aromatic compounds of various degrees......-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed and the results from five case studies highlighting various features of the methodology are presented....

  2. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

    90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for use on a Hewlett Packard Model 1000 mini-computer system with conventional dot matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient data base for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using our computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program were generally two to three days, whereas the computerized interpretation system generally allows reports to be issued the day assays are completed
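
    A rule-based interpretation of the kind described, with reference ranges adjusted for the patient's age and sex, can be sketched as below. The analyte, ranges, and rules are invented placeholders; they are not the laboratory's actual algorithm.

```python
def interpret_result(value, age, sex, analyte="T4 (hypothetical)"):
    """Return a draft interpretation for physician review (illustrative rules only)."""
    low, high = 5.0, 12.0                 # placeholder adult reference range, ug/dL
    if age < 1:                           # infants run higher in this toy rule set
        low, high = 7.0, 16.0
    if sex == "F" and age >= 65:          # another invented adjustment
        high += 1.0
    if value < low:
        return f"{analyte}: {value} is BELOW the reference range ({low}-{high})."
    if value > high:
        return f"{analyte}: {value} is ABOVE the reference range ({low}-{high})."
    return f"{analyte}: {value} is within the reference range ({low}-{high})."

print(interpret_result(4.2, age=45, sex="M"))
print(interpret_result(14.8, age=70, sex="F"))
```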

  3. Tracking the PhD Students' Daily Computer Use

    Science.gov (United States)

    Sim, Kwong Nui; van der Meer, Jacques

    2015-01-01

    This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at their early, mid and final stage in doing their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…

  4. Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to Computational Chemistry. We have created the first web portal (www.chemsem.com) that allows data created in the calculations of quantum chemistry, and other such chemistry calculations, to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used, as an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only the start of what must be a long, multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a new potential file standard (Common Standard for eXchange, CSX) for computational chemistry data. This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that are part of the graph database that the semantic web employs. We propose the CSX file as a convenient way to encapsulate computational chemistry data.
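
    The workflow described here, turning computational-chemistry results into RDF and querying them with SPARQL, can be illustrated with the rdflib library. The namespace and property names below are hypothetical stand-ins, not the actual Gainesville Core terms or the portal's schema.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical namespace standing in for a computational-chemistry ontology
GC = Namespace("http://example.org/compchem#")

g = Graph()
calc = URIRef("http://example.org/calc/0001")
g.add((calc, RDF.type, GC.Calculation))
g.add((calc, GC.method, Literal("B3LYP")))
g.add((calc, GC.basisSet, Literal("6-31G*")))
g.add((calc, GC.totalEnergy, Literal(-76.4089, datatype=XSD.double)))

# SPARQL query over the in-memory graph: energies of all B3LYP calculations
q = """
PREFIX gc: <http://example.org/compchem#>
SELECT ?calc ?energy WHERE {
    ?calc a gc:Calculation ;
          gc:method "B3LYP" ;
          gc:totalEnergy ?energy .
}
"""
for row in g.query(q):
    print(row.calc, float(row.energy))
```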

  5. Partnership in Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the consequent significance of the DOE award in building infrastructure of high performance computing in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.

  6. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
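
    A co-authorship network of the kind analyzed here can be built with networkx by linking every pair of authors who share a paper. The author lists below are invented placeholders; the ranking by paper count and the degree centrality only loosely mirror the study's bibliometric procedure.

```python
from itertools import combinations
from collections import Counter
import networkx as nx

# Invented author lists standing in for bibliographic records
papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Chen"],
    ["Lee", "Park"],
    ["Kim", "Lee", "Chen", "Garcia"],
]

paper_count = Counter(a for authors in papers for a in authors)

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):        # every pair on a paper is linked
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

print("author rank by papers:", paper_count.most_common(3))
print("degree centrality:", nx.degree_centrality(G))
```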

  7. Improvement of measurements, theoretical computations and evaluations of neutron induced helium production cross sections. Summary report on the third and final research co-ordination meeting

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1996-09-01

    The present report contains the Summary of the Third and Final IAEA Research Co-ordination Meeting (RCM) on ''Improvement of Measurements, Theoretical Computations and Evaluations of Neutron Induced Helium Production Cross Sections'' which was hosted by the Tohoku University and held in Sendai, Japan, from 25 to 29 September 1995. This RCM was organized by the IAEA Nuclear Data Section (NDS), with the co-operation and assistance of local organizers from Tohoku University. Summarized are the proceedings and results of the meeting. The List of Participants and meeting Agenda are included. (author)

  8. Using Computers in Fluids Engineering Education

    Science.gov (United States)

    Benson, Thomas J.

    1998-01-01

    Three approaches for using computers to improve basic fluids engineering education are presented. The use of computational fluid dynamics solutions to fundamental flow problems is discussed. The use of interactive, highly graphical software which operates on either a modern workstation or personal computer is highlighted. And finally, the development of 'textbooks' and teaching aids which are used and distributed on the World Wide Web is described. Arguments for and against this technology as applied to undergraduate education are also discussed.

  9. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    Full Text Available In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result, but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.
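
    The idea of automatically accumulating observations alongside a computation can be mimicked outside Haskell as well; the Python sketch below threads a list of trace entries through a chain of 'traced' functions, in the spirit of a Writer-style monad. It is an analogy to the paper's monad transformers, not a port of them.

```python
def unit(value):
    """Wrap a plain value with an empty trace."""
    return value, []

def bind(traced, fn):
    """Feed the value into fn (which returns a traced value) and join the traces."""
    value, trace = traced
    new_value, new_trace = fn(value)
    return new_value, trace + new_trace

def double(x):
    return 2 * x, [f"double({x}) -> {2 * x}"]

def increment(x):
    return x + 1, [f"increment({x}) -> {x + 1}"]

result, trace = bind(bind(unit(5), double), increment)
print(result)          # 11
for line in trace:     # the accumulated observations about the run
    print(" ", line)
```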

  10. Computation cluster for Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S. [Dep. Of Nuclear Physics and Technology, Faculty of Electrical Engineering and Information, Technology, Slovak Technical University, Ilkovicova 3, 81219 Bratislava (Slovakia)

    2010-07-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in nuclear reactor core simulations. Optimization for computation speed was made on both a hardware and a software basis. Hardware cluster parameters, such as the size of the memory, network speed, CPU speed, number of processors per computation, and number of processors in one computer, were tested to shorten the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used to find the weighting functions of the neutron ex-core detectors of VVER-440. (authors)

  11. Computation cluster for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S.

    2010-01-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in nuclear reactor core simulations. Optimization for computation speed was made on both a hardware and a software basis. Hardware cluster parameters, such as the size of the memory, network speed, CPU speed, number of processors per computation, and number of processors in one computer, were tested to shorten the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used to find the weighting functions of the neutron ex-core detectors of VVER-440. (authors)

  12. Introduction: 'History of computing'. Historiography of computers and computer use in the Netherlands

    Directory of Open Access Journals (Sweden)

    Adrienne van den Boogaard

    2008-06-01

    Full Text Available Along with the international trends in history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of use of computers, appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such “home” interest started in 1987 with the work of Eda Kranakis – then active in The Netherlands – commissioned by the national bureau for technology assessment, and Gerard Alberts, turning a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994 when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations, on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation in a team led by Adrienne van den Bogaard. Dutch research reflected the international attention for professionalisation issues (Ensmenger, Haigh) very early on in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation as expected by the (thus outdated) sociology of professions. History of computing not only takes use and users into consideration, but finally, as one may say, confronts the technological side of putting the machine to use, software, head on. The groundbreaking works

  13. Essentials of Computational Electromagnetics

    CERN Document Server

    Sheng, Xin-Qing

    2012-01-01

    Essentials of Computational Electromagnetics provides an in-depth introduction of the three main full-wave numerical methods in computational electromagnetics (CEM); namely, the method of moments (MoM), the finite element method (FEM), and the finite-difference time-domain (FDTD) method. Numerous monographs can be found addressing one of the above three methods. However, few give a broad general overview of essentials embodied in these methods, or were published too early to include recent advances. Furthermore, many existing monographs only present the final numerical results without specifyin
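
    Of the three methods named, FDTD is the easiest to convey in a few lines: electric and magnetic fields are staggered in space and time and updated in a leapfrog loop. The sketch below is a bare 1D vacuum example in normalized units with a Gaussian soft source; it is illustrative only and not drawn from the book.

```python
import numpy as np

nz, nsteps, courant = 200, 400, 0.5
ez = np.zeros(nz)          # electric field at integer grid points
hy = np.zeros(nz - 1)      # magnetic field at half grid points

for n in range(nsteps):
    hy += courant * (ez[1:] - ez[:-1])              # update H from the curl of E
    ez[1:-1] += courant * (hy[1:] - hy[:-1])        # update E from the curl of H
    ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

print("peak |Ez| =", np.abs(ez).max())
```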

  14. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, to demonstrate and to refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version for the NSC nodes and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture

  15. Coordinated Fault-Tolerance for High-Performance Computing Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Panda, Dhabaleswar Kumar [The Ohio State University; Beckman, Pete

    2011-07-28

    existing publish-subscribe tools. We enhanced the intrinsic fault tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included: MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we have examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we have investigated log and root cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work has included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support the monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.

  16. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    Energy Technology Data Exchange (ETDEWEB)

    Houston, Johnny L [PI; Geter, Kerry [Division of Business and Finance

    2013-08-23

    This report covers the Project's third year of implementation in 2007-2008, the final year, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., in an effort to promote research and research training programs in computational science - scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science - scientific visualization and the magnitude of resources available are enormous, permitting a variety of research activities. ECSU's Computational Science-Science Visualization Center will serve as a conduit for directing users to these enormous resources.

  17. Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Aspesi, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bai, J [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Deese, R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shin, L [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-12

    Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit’s primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit’s capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
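
    For a feel of the Conduit data model the clinic worked with, the fragment below builds a small hierarchical node using path-style assignment. It assumes the Conduit Python bindings (the conduit module) are installed; the field names are placeholders, not part of the clinic's prototypes.

```python
import numpy as np
import conduit

n = conduit.Node()
n["state/cycle"] = 100                                   # scalars at hierarchical paths
n["state/time"] = 0.25
n["fields/density/values"] = np.linspace(0.0, 1.0, 8)    # numpy arrays are supported

print(n)                                                 # prints the node hierarchy
```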

  18. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  19. When cloud computing meets bioinformatics: a review.

    Science.gov (United States)

    Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong

    2013-10-01

    In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.

  20. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
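
    Since the detection procedure is cast as a particle swarm optimization over the tissue region, a bare-bones PSO loop helps make the analogy concrete. The 2D objective below is a toy landscape with one optimum standing in for the target site; the coefficients are commonly used assumed values, and none of the in vivo propagation or control effects discussed in the paper are modeled.

```python
import numpy as np

rng = np.random.default_rng(0)

def landscape(p):
    """Toy objective: smaller is better, minimum at (3, -2)."""
    return np.sum((p - np.array([3.0, -2.0])) ** 2, axis=-1)

n, dim, steps = 20, 2, 100
x = rng.uniform(-10, 10, size=(n, dim))     # particle positions ("agents")
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), landscape(x)
gbest = pbest[np.argmin(pbest_val)].copy()

w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and acceleration coefficients (assumed)
for _ in range(steps):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = landscape(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("estimated optimum:", gbest)
```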

  1. IBM Cloud Computing Powering a Smarter Planet

    Science.gov (United States)

    Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu

    With the increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend to provide the dynamic infrastructure that makes such intelligence possible. This article introduces how to build a smarter planet with cloud computing technology. First, it explains why we need the cloud and traces the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.

  2. Green Cloud Computing: A Literature Survey

    Directory of Open Access Journals (Sweden)

    Laura-Diana Radu

    2017-11-01

    Full Text Available Cloud computing is a dynamic field of information and communication technologies (ICTs), introducing new challenges for environmental protection. Cloud computing technologies have a variety of application domains, since they offer scalability, are reliable and trustworthy, and offer high performance at relatively low cost. The cloud computing revolution is redesigning modern networking, and offering promising environmental protection prospects as well as economic and technological advantages. These technologies have the potential to improve energy efficiency and to reduce carbon footprints and (e-)waste. These features can transform cloud computing into green cloud computing. In this survey, we review the main achievements of green cloud computing. First, an overview of cloud computing is given. Then, recent studies and developments are summarized, and environmental issues are specifically addressed. Finally, future research directions and open problems regarding green cloud computing are presented. This survey is intended to serve as up-to-date guidance for research with respect to green cloud computing.

  3. Active and passive computed tomography mixed waste focus area final report

    International Nuclear Information System (INIS)

    Becker, G K; Camp, D C; Decman, D J; Jackson, J A; Martz, H E; Roberson, G P.

    1998-01-01

    The Mixed Waste Focus Area (MWFA) Characterization Development Strategy delineates an approach to resolve technology deficiencies associated with the characterization of mixed wastes. The intent of this strategy is to ensure the availability of technologies to support the Department of Energy's (DOE) mixed-waste, low-level or transuranic (TRU) contaminated waste characterization management needs. To this end the MWFA has defined and coordinated characterization development programs to ensure that data and test results necessary to evaluate the utility of non-destructive assay technologies are available to meet site contact-handled waste management schedules. Requirements used as technology development project benchmarks are based on the National TRU Program Quality Assurance Program Plan. These requirements include the ability to determine total bias and total measurement uncertainty. These parameters must be completely evaluated for the waste types to be processed through a given nondestructive waste assay system, and this evaluation constitutes the foundation of the activities undertaken in technology development projects. Once development and testing activities have been completed, Innovative Technology Summary Reports are generated to provide results and conclusions to support EM-30, -40, or -60 end user or customer technology selection. The active and passive computed tomography non-destructive assay system is one of the technologies selected for development by the MWFA. Lawrence Livermore National Laboratory (LLNL) has developed the active and passive computed tomography (A&PCT) nondestructive assay (NDA) technology to identify and accurately quantify all detectable radioisotopes in closed containers of waste. This technology will be applicable to all types of waste regardless of their classification: low-level, transuranic, or mixed. Mixed waste contains radioactivity and hazardous organic species. The scope of our technology is to develop a non-invasive waste-drum scanner that

  4. Computing facilities available to final-year students at 3 UK dental schools in 1997/8: their use, and students' attitudes to information technology.

    Science.gov (United States)

    Grigg, P; Macfarlane, T V; Shearer, A C; Jepson, N J; Stephens, C D

    2001-08-01

    To identify the computer facilities available in 3 dental schools where 3 different approaches to the use of technology-based learning material have been adopted, and to assess dental students' perception of their own computer skills and their attitudes towards information technology. Multicentre cross-sectional study by questionnaire. All 181 dental students in their final year of study (1997-8). The overall participation rate was 80%. There were no differences between schools in the students' self-assessment of their IT skills, but only one-third regarded themselves as competent in basic skills, and nearly 50% of students in all 3 schools felt that insufficient IT training had been provided to enable them to follow their course without difficulty. There were significant differences between schools in most of the other areas examined, which reflect the different ways in which IT can be used to support the dental course. 1. Students value IT as an educational tool. 2. Their awareness of the relevance of a knowledge of information technology for their future careers remains generally low. 3. There is a need to provide effective instruction in IT skills for those dental students who do not acquire these during secondary education.

  5. Impact of energy conservation policy measures on innovation, investment and long-term development of the Swiss economy. Results from the computable induced technical change and energy (CITE) model - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bretschger, L.; Ramer, R.; Schwark, F.

    2010-09-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the results of a study made with the Computable Induced Technical Change and Energy (CITE) model. The authors note that, in the past two centuries, the Swiss economy experienced an unprecedented increase in living standards. At the same time, the stock of various natural resources declined and the environmental conditions changed substantially. The evaluation of the sustainability of a low-energy and low-carbon society, as well as an optimal transition to this state, is discussed. An economic analysis is made and the CITE and CGE (Computable General Equilibrium) numerical simulation models are discussed. The results obtained are presented and discussed.

  6. FOCUS: a fire management planning system -- final report

    Science.gov (United States)

    Frederick W. Bratten; James B. Davis; George T. Flatman; Jerold W. Keith; Stanley R. Rapp; Theodore G. Storey

    1981-01-01

    FOCUS (Fire Operational Characteristics Using Simulation) is a computer simulation model for evaluating alternative fire management plans. This final report provides a broad overview of the FOCUS system, describes its two major modules (fire suppression and cost), explains the role of gaming large fires in the system, and outlines the support programs and ways of...

  7. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the facts that a computer can be infected by infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus capability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of the computer virus on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computers disappears and the virus dies out, and P0 is a globally asymptotically stable equilibrium if R0 < 1; if R0 > 1, then this model has only one viral equilibrium P*, which means that the virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
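
    The sketch below integrates a generic SEIR-style compartment model with forward Euler steps to illustrate the threshold behaviour described in the abstract; the equations, parameter values, and the simple R0 = beta/gamma estimate are illustrative assumptions, not the exact model of the paper.

        # Generic SEIR-style computer-virus model, integrated with forward Euler.
        # Parameters and the R0 estimate are illustrative, not the paper's model.
        def simulate(beta=0.3, sigma=0.2, gamma=0.1, days=400.0, dt=0.1):
            S, E, I, R = 0.99, 0.0, 0.01, 0.0       # fractions of all computers
            for _ in range(int(days / dt)):
                new_exposed   = beta * S * I * dt    # susceptible -> exposed
                new_infected  = sigma * E * dt       # exposed -> infected
                new_recovered = gamma * I * dt       # infected -> immune (antivirus)
                S -= new_exposed
                E += new_exposed - new_infected
                I += new_infected - new_recovered
                R += new_recovered
            return I                                 # infected fraction at the end

        print("R0 = 3.0, infected fraction:", round(simulate(beta=0.3), 4))   # endemic
        print("R0 = 0.5, infected fraction:", round(simulate(beta=0.05), 6))  # dies out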

  8. Parallel Computing:. Some Activities in High Energy Physics

    Science.gov (United States)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing from the proposed SIMD front end detectors, the farming applications, high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general purpose computing. The developments around farming are then described from its simplest form to the more complex system in Fermilab. Finally, there is a list of some developments that are happening close to the experiments.

  9. A RECIPE FOR LINEAR COLLIDER FINAL FOCUS SYSTEM DESIGN

    International Nuclear Information System (INIS)

    Seryi, Andrei

    2003-01-01

    The design of Final Focus systems for linear colliders is challenging because of the large demagnifications needed to produce nanometer-sized beams at the interaction point. Simple first- and second-order matrix matching have proven insufficient for this task, and minimization of third- and higher-order aberrations is essential. An appropriate strategy is required for the latter to be successful. A recipe for Final Focus design, and a set of computational tools used to implement this approach, are described herein. An example of the use of this procedure is given

  10. Computer simulation of transitional process to the final stable Brayton cycle in magnetic refrigeration

    International Nuclear Information System (INIS)

    Numasawa, T.; Hashimoto, T.

    1981-01-01

    The final working cycle in magnetic refrigeration largely depends on the heat transfer coefficient β in the system, the parameter γ of the heat inflow from the outer system to this cycle, and the period τ of the cycle. Therefore, in order to make this dependence clear, the time variation of the Brayton cycle with β, γ and τ has been investigated. In the present paper the transitional process of this cycle and the dependence of the final cooling temperature of the heat load on β, γ and τ are shown. (orig.)

  11. Computer-Aided Authoring of Programmed Instruction for Teaching Symbol Recognition. Final Report.

    Science.gov (United States)

    Braby, Richard; And Others

    This description of AUTHOR, a computer program for the automated authoring of programmed texts designed to teach symbol recognition, includes discussions of the learning strategies incorporated in the design of the instructional materials, hardware description and the algorithm for the software, and current and future developments. Appendices…

  12. Implementation of cloud computing in higher education

    Science.gov (United States)

    Asniar; Budiawan, R.

    2016-04-01

    Cloud computing research is a new trend in distributed computing, where people have developed services and SOA (Service Oriented Architecture) based applications. This technology is very useful to implement, especially in higher education. This research studies the need for and feasibility of cloud computing in higher education and then proposes a model of cloud computing services for higher education in Indonesia that can be implemented to support academic activities. A literature study is used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and hybrid cloud is the recommended deployment model.

  13. Adiabatic graph-state quantum computation

    International Nuclear Information System (INIS)

    Antonio, B; Anders, J; Markham, D

    2014-01-01

    Measurement-based quantum computation (MBQC) and holonomic quantum computation (HQC) are two very different computational methods. The computation in MBQC is driven by adaptive measurements executed in a particular order on a large entangled state. In contrast in HQC the system starts in the ground subspace of a Hamiltonian which is slowly changed such that a transformation occurs within the subspace. Following the approach of Bacon and Flammia, we show that any MBQC on a graph state with generalized flow (gflow) can be converted into an adiabatically driven holonomic computation, which we call adiabatic graph-state quantum computation (AGQC). We then investigate how properties of AGQC relate to the properties of MBQC, such as computational depth. We identify a trade-off that can be made between the number of adiabatic steps in AGQC and the norm of H-dot as well as the degree of H, in analogy to the trade-off between the number of measurements and classical post-processing seen in MBQC. Finally the effects of performing AGQC with orderings that differ from standard MBQC are investigated. (paper)

  14. An Introduction to Quantum Computing, Without the Physics

    OpenAIRE

    Nannicini, Giacomo

    2017-01-01

    This paper is a gentle but rigorous introduction to quantum computing intended for discrete mathematicians. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon's algorithm and Grover's algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefor...
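
    As a concrete companion to this abstract, the sketch below simulates Grover's algorithm classically on a 3-qubit statevector with numpy; it illustrates the algorithm named in the abstract but is not taken from the paper's own formalism.

        # Classical statevector simulation of Grover's search on 3 qubits
        # (an illustration of the algorithm named above, not the paper's code).
        import numpy as np

        n, marked = 3, 5                     # 8 basis states, item |5> is marked
        N = 2 ** n
        state = np.full(N, 1 / np.sqrt(N))   # uniform superposition (Hadamards)

        oracle = np.eye(N)
        oracle[marked, marked] = -1          # phase flip on the marked item

        diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about mean

        for _ in range(int(np.pi / 4 * np.sqrt(N))):         # ~ (pi/4)*sqrt(N) rounds
            state = diffusion @ (oracle @ state)

        probs = state ** 2
        print("most probable outcome:", int(np.argmax(probs)),
              "with probability", round(float(probs.max()), 3))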

  15. Computational materials design

    International Nuclear Information System (INIS)

    Snyder, R.L.

    1999-01-01

    Full text: Trial-and-error experimentation is an extremely expensive route to the development of new materials. The coming age of reduced defense funding will dramatically alter the way in which advanced materials are developed. In the absence of large funding we must concentrate on reducing the time and expense that the R and D of a new material consumes. This may be accomplished through the development of computational materials science. Materials are selected today by comparing the technical requirements to the materials databases. When existing materials cannot meet the requirements, we explore new systems to develop a new material, using experimental databases like the PDF. After proof of concept, scaling the new material to manufacture requires evaluating millions of parameter combinations to optimize the performance of the new device. Historically this process takes 10 to 20 years and requires hundreds of millions of dollars. The development of a focused set of computational tools to predict the final properties of new materials will permit the exploration of new materials systems with only a limited amount of materials characterization. However, to bound computational extrapolations, the experimental formulations and characterization will need to be tightly coupled to the computational tasks. The required experimental data must be obtained by dynamic, in-situ, very rapid characterization. Finally, to evaluate the optimization matrix required to manufacture the new material, very rapid in-situ analysis techniques will be essential to intelligently monitor and optimize the formation of a desired microstructure. Techniques and examples for the rapid real-time application of XRPD and optical microscopy will be shown. Recent developments in the cross-linking of the world's structural and diffraction databases will be presented as the basis for the future Total Pattern Analysis by XRPD. Copyright (1999) Australian X-ray Analytical Association Inc

  16. Security and privacy in billing services in cloud computing

    OpenAIRE

    Μακρή, Ελένη - Λασκαρίνα

    2013-01-01

    The purpose of this master thesis is to define cloud computing and to introduce its basic principles. Firstly, the history of cloud computing will be briefly discussed, starting from the past and ending up to the current and future situation. Furthermore, the most important characteristics of cloud computing, such as security, privacy and cost, will be analyzed. Moreover the three service and three deployment models of cloud computing will be defined and analyzed with examples. Finally, the a...

  17. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    Energy Technology Data Exchange (ETDEWEB)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  18. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final-year undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  19. Review of tolerances at the Final Focus Test Beam

    International Nuclear Information System (INIS)

    Bulos, F.; Burke, D.; Helm, R.; Irwin, J.; Roy, G.; Yamamoto, N.

    1991-01-01

    The authors review the tolerances associated with the Final Focus Test Beam (FFTB). The authors have computed the acceptability window of the input beam for orbit jitter, emittance beta functions mismatch, incoming dispersion and coupling; tolerances on magnet alignment, strength and multipole content; and the initial tuneability capture of the line

  20. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  1. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    Cavina, A.

    2013-01-01

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication. It is the first IAEA document specifically addressing computer security. This document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; secondly, existing international guidance is not industry specific and fails to capture some of the key issues; and thirdly, the presence of more or less connected digital systems is increasing in the design of nuclear power plants. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach.

  2. Emerging Trends in Heart Valve Engineering: Part IV. Computational Modeling and Experimental Studies.

    Science.gov (United States)

    Kheradvar, Arash; Groves, Elliott M; Falahatpisheh, Ahmad; Mofrad, Mohammad K; Hamed Alavi, S; Tranquillo, Robert; Dasi, Lakshmi P; Simmons, Craig A; Jane Grande-Allen, K; Goergen, Craig J; Baaijens, Frank; Little, Stephen H; Canic, Suncica; Griffith, Boyce

    2015-10-01

    In this final portion of an extensive review of heart valve engineering, we focus on the computational methods and experimental studies related to heart valves. The discussion begins with a thorough review of computational modeling and the governing equations of fluid and structural interaction. We then move onto multiscale and disease specific modeling. Finally, advanced methods related to in vitro testing of the heart valves are reviewed. This section of the review series is intended to illustrate application of computational methods and experimental studies and their interrelation for studying heart valves.

  3. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the resulting experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 octets per second and a recording capacity of a few tens of 10^15 octets each year. In order to meet this challenge, a computing network based on the dispatch and sharing of tasks has been set up: the W-LCG grid (Worldwide LHC Computing Grid), made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 centres. A Tier 1 is typically a national centre; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier 2 centres. A Tier 2 is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 centres, at the level of the laboratories, provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)
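
    A back-of-the-envelope check of the orders of magnitude quoted above; the assumed sustained rate of 2 x 10^9 octets per second is an illustrative value within the quoted range, not a figure from the record.

        # Order-of-magnitude check of the quoted data rates (illustrative values).
        rate_octets_per_s = 2e9        # "a few 10^9 octets per second" (assumed)
        seconds_per_year = 3.15e7
        print("continuous recording: %.1e octets/year"
              % (rate_octets_per_s * seconds_per_year))
        # ~6e16 octets/year, i.e. a few tens of 10^15 octets, consistent with the
        # quoted yearly recording capacity given that the machine does not run
        # continuously and only selected events are kept.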

  4. Computer-Aided Drug Design in Epigenetics

    Science.gov (United States)

    Lu, Wenchao; Zhang, Rukang; Jiang, Hao; Zhang, Huimin; Luo, Cheng

    2018-03-01

    Epigenetic dysfunction has been widely implicated in several diseases, especially cancers, thus highlighting the therapeutic potential for chemical interventions in this field. With the rapid development of computational methodologies and high-performance computational resources, computer-aided drug design has emerged as a promising strategy to speed up epigenetic drug discovery. Herein, we give a brief overview of the major computational methods reported in the literature, including druggability prediction, virtual screening, homology modeling, scaffold hopping, pharmacophore modeling, molecular dynamics simulations, quantum chemistry calculation and 3D quantitative structure-activity relationship, that have been successfully applied in the design and discovery of epi-drugs and epi-probes. Finally, we discuss the major limitations of current virtual drug design strategies in epigenetics drug discovery and future directions in this field.

  5. Computer-Aided Drug Design in Epigenetics

    Science.gov (United States)

    Lu, Wenchao; Zhang, Rukang; Jiang, Hao; Zhang, Huimin; Luo, Cheng

    2018-01-01

    Epigenetic dysfunction has been widely implicated in several diseases, especially cancers, thus highlighting the therapeutic potential for chemical interventions in this field. With the rapid development of computational methodologies and high-performance computational resources, computer-aided drug design has emerged as a promising strategy to speed up epigenetic drug discovery. Herein, we give a brief overview of the major computational methods reported in the literature, including druggability prediction, virtual screening, homology modeling, scaffold hopping, pharmacophore modeling, molecular dynamics simulations, quantum chemistry calculation, and 3D quantitative structure-activity relationship, that have been successfully applied in the design and discovery of epi-drugs and epi-probes. Finally, we discuss the major limitations of current virtual drug design strategies in epigenetics drug discovery and future directions in this field. PMID:29594101

  6. Quantum computers: Definition and implementations

    International Nuclear Information System (INIS)

    Perez-Delgado, Carlos A.; Kok, Pieter

    2011-01-01

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focusing both experimental and theoretical research in quantum-information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. Therefore, the question is what are the general criteria for implementing quantum computers. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following criteria: Any quantum computer must consist of a quantum memory, with an additional structure that (1) facilitates a controlled quantum evolution of the quantum memory; (2) includes a method for information theoretic cooling of the memory; and (3) provides a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we present a decision tree for selecting an avenue toward building a quantum computer. This is intended to help experimentalists determine the most natural paradigm given a particular physical implementation.

  7. Review of tolerances at the Final Focus Test Beam

    International Nuclear Information System (INIS)

    Bulos, F.; Burke, D.; Helm, R.; Irwin, J.; Roy, G.; Yamamoto, N.

    1991-05-01

    We review the tolerances associated with the Final Focus Test Beam (FFTB). We have computed the acceptability window of the input beam for orbit jitter, emittance beta functions mismatch, incoming dispersion and coupling; tolerances on magnet alignment, strength and multipole content; and the initial tuneability capture of the line. 2 refs., 1 fig

  8. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

    Recent developments in mathematics have shown the importance of the elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are brought to light, and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method gives, as the result of its computations, only functions of the variable time. But the course of computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structure as analog computation. This structural analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  9. Relativistic quantum chemistry on quantum computers

    DEFF Research Database (Denmark)

    Veis, L.; Visnak, J.; Fleig, T.

    2012-01-01

    The past few years have witnessed a remarkable interest in the application of quantum computing for solving problems in quantum chemistry more efficiently than classical computers allow. Very recently, proof-of-principle experimental realizations have been reported. However, so far only the nonrelativistic regime (i.e., the Schrödinger equation) has been explored, while it is well known that relativistic effects can be very important in chemistry. We present a quantum algorithm for relativistic computations of molecular energies. We show how to efficiently solve the eigenproblem of the Dirac-Coulomb Hamiltonian on a quantum computer and demonstrate the functionality of the proposed procedure by numerical simulations of computations of the spin-orbit splitting in the SbH molecule. Finally, we propose quantum circuits with three qubits and nine or ten controlled-NOT (CNOT) gates, which implement a proof

  10. Electromagnetic Compatibility Design of the Computer Circuits

    Science.gov (United States)

    Zitai, Hong

    2018-02-01

    Computers and the Internet have gradually penetrated every aspect of people's daily work. But with the improvement of electronic equipment and electrical systems, the electromagnetic environment has become much more complex, and electromagnetic interference has become an important factor hindering the normal operation of electronic equipment. In order to analyse the electromagnetic compatibility of computer circuits, this paper starts from computer electromagnetics and the concept of electromagnetic compatibility. Then, through an analysis of the main electromagnetic compatibility problems of computer circuits and systems, computer circuits can be designed with electromagnetic compatibility in mind. Finally, the basic contents and methods of EMC testing are expounded in order to ensure the electromagnetic compatibility of the equipment.

  11. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    Science.gov (United States)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full-field flow patterns applicable in various geometries and in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and to intrusive systems such as hot-wire/film anemometry. The method makes the flow patterns obtainable and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. The method uses a computer-based digitizing system attached through an imaging board to a low-luminosity camera. A customized optical train allows the system to become a long distance microscope (LDM), allowing magnifications of areas of interest ranging up to 100 times. Presented in addition to the method itself are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air-breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.
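
    The sketch below illustrates the window cross-correlation step that digital PIV implementations commonly use to turn two particle images into a displacement estimate; it demonstrates the general principle only and is not the FFFT system described in this thesis.

        # Minimal digital-PIV cross-correlation: estimate the displacement of a
        # particle pattern between two frames (general principle only, not FFFT).
        import numpy as np
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(1)
        frame_a = rng.random((64, 64))                       # synthetic particle image
        true_shift = (3, -2)                                 # imposed displacement
        frame_b = np.roll(frame_a, true_shift, axis=(0, 1))  # displaced copy

        # Cross-correlation via convolution with a flipped, mean-removed template.
        corr = fftconvolve(frame_b - frame_b.mean(),
                           (frame_a - frame_a.mean())[::-1, ::-1], mode="same")
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        estimate = (peak[0] - frame_a.shape[0] // 2, peak[1] - frame_a.shape[1] // 2)
        print("true shift:", true_shift, "estimated shift:", estimate)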

  12. Modular Universal Scalable Ion-trap Quantum Computer

    Science.gov (United States)

    2016-06-02

    The main goal of the original MUSIQC proposal was to construct and demonstrate a modular and universally-expandable ion... (Final report: Modular Universal Scalable Ion-trap Quantum Computer, covering 1 August 2010 to 31 January 2016; distribution unlimited. Keywords: ion trap quantum computation, scalable modular architectures.)

  13. Computational physics: an introduction (second edition)

    International Nuclear Information System (INIS)

    Borcherds, Peter

    2002-01-01

    This book has much in common with many other Computational Physics texts, some of which are helpfully listed by the author in 'A subjective review on related texts'. The first five chapters are introductory, covering finite differences, linear algebra, stochastics and ordinary and partial differential equations. The final section of chapter 3 is entitled 'Stochastic Optimisation', and covers Simulated Annealing and Genetic Algorithms. Neither topic is adequately covered; an explicit example, with algorithms, in each case would have been helpful. However, few other computational physics texts mention these topics at all. The chapters in the final part of the book are more advanced, and cover comprehensively Simulation and Statistical Mechanics, Quantum Mechanical Simulation and Hydrodynamics. These chapters include specialist material not found in other texts, e.g. Alder vortices and the Nosé-Hoover method. There is extensive coverage of Ewald summation. The author is in the course of augmenting his book with web-resident sample programs, which should enhance the value of the book. This book should appeal to anyone working in the fields covered in the final section. It ought also to be in any physics library. (author)

  14. Use of cloud computing in biomedicine.

    Science.gov (United States)

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  15. [Experimental nuclear physics]. Final report

    International Nuclear Information System (INIS)

    1991-04-01

    This is the final report of the Nuclear Physics Laboratory of the University of Washington on work supported in part by US Department of Energy contract DE-AC06-81ER40048. It contains chapters on giant dipole resonances in excited nuclei, nucleus-nucleus reactions, astrophysics, polarization in nuclear reactions, fundamental symmetries and interactions, accelerator mass spectrometry (AMS), ultra-relativistic heavy ions, medium energy reactions, work by external users, instrumentation, accelerators and ion sources, and computer systems. An appendix lists Laboratory personnel, a Ph. D. degree granted in the 1990-1991 academic year, and publications. Refs., 41 figs., 7 tabs

  16. [Experimental nuclear physics]. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-04-01

    This is the final report of the Nuclear Physics Laboratory of the University of Washington on work supported in part by US Department of Energy contract DE-AC06-81ER40048. It contains chapters on giant dipole resonances in excited nuclei, nucleus-nucleus reactions, astrophysics, polarization in nuclear reactions, fundamental symmetries and interactions, accelerator mass spectrometry (AMS), ultra-relativistic heavy ions, medium energy reactions, work by external users, instrumentation, accelerators and ion sources, and computer systems. An appendix lists Laboratory personnel, a Ph. D. degree granted in the 1990-1991 academic year, and publications. Refs., 41 figs., 7 tabs.

  17. Virtualized Network Control. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Ghani, Nasir [Univ. of New Mexico, Albuquerque, NM (United States)]

    2013-02-01

    This document is the final report for the Virtualized Network Control (VNC) project, which was funded by the United States Department of Energy (DOE) Office of Science. This project was also informally referred to as Advanced Resource Computation for Hybrid Service and TOpology NEtworks (ARCHSTONE). This report provides a summary of the project's activities, tasks, deliverables, and accomplishments. It also provides a summary of the documents, software, and presentations generated as part of the project's activities. Namely, the Appendix contains an archive of the deliverables, documents, and presentations generated as part of this project.

  18. Computing Cosmic Cataclysms

    Science.gov (United States)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  19. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes which are used in support of a license application for high-level waste disposal. The guidelines cover theoretical basis, programming, and instructions for use of the code

  20. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  1. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
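
    A minimal example of the kind of computation the book treats: a Gauss-Seidel power flow iteration for a single slack bus feeding one load bus over a line. The per-unit numbers are illustrative assumptions and are not taken from the book.

        # Gauss-Seidel power flow for a two-bus system: slack bus 1 feeds a load
        # at bus 2 over one line. Illustrative per-unit values, not from the book.
        import numpy as np

        z_line = 0.01 + 0.05j                  # line impedance (p.u.)
        y = 1.0 / z_line
        Y = np.array([[y, -y], [-y, y]])       # bus admittance matrix
        V1 = 1.0 + 0.0j                        # slack bus voltage (reference)
        S2 = -(0.8 + 0.4j)                     # injected power at bus 2 (a load)

        V2 = 1.0 + 0.0j                        # flat start
        for _ in range(50):                    # fixed-point (Gauss-Seidel) updates
            V2 = (np.conj(S2) / np.conj(V2) - Y[1, 0] * V1) / Y[1, 1]

        print("bus 2 voltage: %.4f p.u. at %.2f degrees"
              % (abs(V2), np.degrees(np.angle(V2))))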

  2. Self-study manual for introduction to computational fluid dynamics

    OpenAIRE

    Nabatov, Andrey

    2017-01-01

    Computational Fluid Dynamics (CFD) is the branch of Fluid Mechanics and Computational Physics that plays a decent role in the modern Mechanical Engineering Design process, due to such advantages as the relatively low cost of simulation compared with conducting a real experiment, the opportunity to easily correct the design of a prototype prior to manufacturing the final product, and a wide range of applications: mixing, acoustics, cooling and aerodynamics. This makes CFD particularly and Computation...

  3. Guide to improving the performance of a manipulator system for nuclear fuel handling through computer controls. Final report

    International Nuclear Information System (INIS)

    Evans, J.M. Jr.; Albus, J.S.; Barbera, A.J.; Rosenthal, R.; Truitt, W.B.

    1975-11-01

    The Office of Developmental Automation and Control Technology of the Institute for Computer Sciences and Technology of the National Bureau of Standards provides advising services, standards and guidelines on interface and computer control systems, and performance specifications for the procurement and use of computer controlled manipulators and other computer based automation systems. These outputs help other agencies and industry apply this technology to increase productivity and improve work quality by removing men from hazardous environments. In FY 74 personnel from the Oak Ridge National Laboratory visited NBS to discuss the feasibility of using computer control techniques to improve the operation of remote control manipulators in nuclear fuel reprocessing. Subsequent discussions led to an agreement for NBS to develop a conceptual design for such a computer control system for the PaR Model 3000 manipulator in the Thorium Uranium Recycle Facility (TURF) at ORNL. This report provides the required analysis and conceptual design. Complete computer programs are included for testing of computer interfaces and for actual robot control in both point-to-point and continuous path modes

  4. Computer-Aided Drug Design in Epigenetics

    Directory of Open Access Journals (Sweden)

    Wenchao Lu

    2018-03-01

    Full Text Available Epigenetic dysfunction has been widely implicated in several diseases, especially cancers, thus highlighting the therapeutic potential for chemical interventions in this field. With the rapid development of computational methodologies and high-performance computational resources, computer-aided drug design has emerged as a promising strategy to speed up epigenetic drug discovery. Herein, we give a brief overview of the major computational methods reported in the literature, including druggability prediction, virtual screening, homology modeling, scaffold hopping, pharmacophore modeling, molecular dynamics simulations, quantum chemistry calculation, and 3D quantitative structure-activity relationship, that have been successfully applied in the design and discovery of epi-drugs and epi-probes. Finally, we discuss the major limitations of current virtual drug design strategies in epigenetics drug discovery and future directions in this field.

  5. Dense image correspondences for computer vision

    CERN Document Server

    Liu, Ce

    2016-01-01

    This book describes the fundamental building-block of many new computer vision systems: dense and robust correspondence estimation. Dense correspondence estimation techniques are now successfully being used to solve a wide range of computer vision problems, very different from the traditional applications such techniques were originally developed to solve. This book introduces the techniques used for establishing correspondences between challenging image pairs, the novel features used to make these techniques robust, and the many problems dense correspondences are now being used to solve. The book provides information to anyone attempting to utilize dense correspondences in order to solve new or existing computer vision problems. The editors describe how to solve many computer vision problems by using dense correspondence estimation. Finally, it surveys resources, code, and data necessary for expediting the development of effective correspondence-based computer vision systems.

  6. Fourth SIAM conference on mathematical and computational issues in the geosciences: Final program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    The conference focused on computational and modeling issues in the geosciences. Of the geosciences, problems associated with phenomena occurring in the earth's subsurface were best represented. Topics in this area included petroleum recovery, ground water contamination and remediation, seismic imaging, parameter estimation, upscaling, geostatistical heterogeneity, reservoir and aquifer characterization, optimal well placement and pumping strategies, and geochemistry. Additional sessions were devoted to the atmosphere, surface water and oceans. The central mathematical themes included computational algorithms and numerical analysis, parallel computing, mathematical analysis of partial differential equations, statistical and stochastic methods, optimization, inversion, homogenization and renormalization. The problem areas discussed at this conference are of considerable national importance, with the increasing importance of environmental issues, global change, remediation of waste sites, declining domestic energy sources and an increasing reliance on producing the most out of established oil reservoirs.

  7. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  8. Brain architecture: a design for natural computation.

    Science.gov (United States)

    Kaiser, Marcus

    2007-12-15

    Fifty years ago, John von Neumann compared the architecture of the brain with that of the computers he invented and which are still in use today. In those days, the organization of computers was based on concepts of brain organization. Here, we give an update on current results on the global organization of neural systems. For neural systems, we outline how the spatial and topological architecture of neuronal and cortical networks facilitates robustness against failures, fast processing and balanced network activation. Finally, we discuss mechanisms of self-organization for such architectures. After all, the organization of the brain might again inspire computer architecture.
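
    A toy illustration of the kind of topological measures such studies report, computed with networkx on a synthetic small-world graph rather than on real neuronal or cortical connectivity data:

        # Topological measures on a synthetic small-world graph (a surrogate for
        # a cortical network; real connectivity data are not used here).
        import random
        import networkx as nx

        G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=42)

        print("average clustering:  %.3f" % nx.average_clustering(G))
        print("average path length: %.3f" % nx.average_shortest_path_length(G))

        # Robustness against failures: remove 5% of the nodes at random and
        # check that the largest connected component remains large.
        random.seed(0)
        H = G.copy()
        H.remove_nodes_from(random.sample(list(H.nodes), 5))
        largest = max(nx.connected_components(H), key=len)
        print("largest component after random failures:", len(largest))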

  9. Synchrotron Imaging Computations on the Grid without the Computing Element

    International Nuclear Information System (INIS)

    Curri, A; Pugliese, R; Borghes, R; Kourousias, G

    2011-01-01

    Besides the heavy use of the Grid in the Synchrotron Radiation Facility (SRF) Elettra, additional special requirements from the beamlines had to be satisfied through a novel solution that we present in this work. In the traditional Grid Computing paradigm the computations are performed on the Worker Nodes of the grid element known as the Computing Element. A Grid middleware extension that our team has been working on is that of the Instrument Element. In general it is used to Grid-enable instrumentation, and it can be seen as a neighbouring concept to that of traditional Control Systems. As a further extension, we demonstrate the Instrument Element as the steering mechanism for a series of computations. In our deployment it interfaces a Control System that manages a series of computationally demanding Scientific Imaging tasks in an online manner. Instrument control in Elettra is done through a suitable Distributed Control System, a common approach in the SRF community. The applications that we present are for a beamline working in medical imaging. The solution resulted in a substantial improvement of a Computed Tomography workflow. The near-real-time requirements could not have been easily satisfied by our Grid's middleware (gLite) due to the various latencies that often occurred during the job submission and queuing phases. Moreover, the required deployment of a set of TANGO devices could not have been done in a standard gLite WN. Besides the avoidance of certain core Grid components, the Grid Security infrastructure has been utilised in the final solution.

  10. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from its use; it allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  11. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  12. Computing in high-energy physics

    Science.gov (United States)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  13. Design of a Computer-Controlled, Random-Access Slide Projector Interface. Final Report (April 1974 - November 1974).

    Science.gov (United States)

    Kirby, Paul J.; And Others

    The design, development, test, and evaluation of an electronic hardware device interfacing a commercially available slide projector with a plasma panel computer terminal is reported. The interface device allows an instructional computer program to select slides for viewing based upon the lesson student situation parameters of the instructional…

  14. Optical computer switching network

    Science.gov (United States)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.

  15. Plasma lenses for SLAC Final Focus Test facility

    International Nuclear Information System (INIS)

    Betz, D.; Cline, D.; Joshi, C.; Rajagopalan, S.; Rosenzweig, J.; Su, J.J.; Williams, R.; Chen, P.; Gundersen, M.; Katsouleas, T.; Norem, J.

    1991-01-01

    A collaborative group of accelerator and plasma physicists and engineers has formed with an interest in exploring the use of plasma lenses to meet the needs of future colliders. Analytic and computational models of plasma lenses are briefly reviewed and several design examples for the SLAC Final Focus Test Beam are presented. The examples include discrete, thick, and adiabatic lenses. A potential plasma source with desirable lens characteristics is presented

  16. URSULA2 computer program. Volume 3. User's manual. Final report

    International Nuclear Information System (INIS)

    Singhal, A.K.

    1980-01-01

    This report is intended to provide documentation for the users of the URSULA2 code so that they can appreciate its important features such as: code structure, flow chart, grid notations, coding style, usage of secondary storage and its interconnection with the input preparation program (Reference H3201/4). Subroutines and subprograms have been divided into four functional groups. The functions of all subroutines have been explained with particular emphasis on the control subroutine (MAIN) and the data input subroutine (BLOCK DATA). Computations for the flow situations similar to the reference case can be performed simply by making alterations in BLOCK DATA. Separate guides for the preparation of input data and for the interpretation of program output have been provided. Furthermore, two appendices; one for the URSULA2 listing and the second for the glossary of FORTRAN variables, are included to make this report self-sufficient

  17. Digitalization of Education System and Teacher Educators' Computer Skill in Bangladesh

    Science.gov (United States)

    Rahman, Mohammad Ataur

    2011-01-01

    This study examined how teacher educators perceive the incorporation and use of computer technology resources in Teachers' Training Colleges in Bangladesh. The study encompasses a thorough investigation of teacher educators' "computer skills" using valid and reliable instruments. The study finally examined whether any…

  18. 9th Symposium on Computational Statistics

    CERN Document Server

    Mildner, Vesna

    1990-01-01

    Although no one is, probably, too enthused about the idea, it is a fact that the development of most empirical sciences to a great extent depends on the development of data analysis methods and techniques, which, due to the necessity of applying computers for that purpose, means that it practically depends on the advancement and orientation of computer statistics. Every other year the International Association for Statistical Computing sponsors the organization of meetings of individuals professionally involved in computational statistics. Since these meetings attract professionals from all over the world, they are a good sample for the estimation of trends in this area, which some believe is statistics proper while others claim it is computer science. It seems, though, that an increasing number of colleagues treat it as an independent scientific or at least technical discipline. This volume contains six invited papers, 41 contributed papers and, finally, two papers which are, formally, softwa...

  19. Trusted computing platforms TPM2.0 in context

    CERN Document Server

    Proudler, Graeme; Dalton, Chris

    2015-01-01

    In this book the authors first describe the background of trusted platforms and trusted computing and speculate about the future. They then describe the technical features and architectures of trusted platforms from several different perspectives, finally explaining second-generation TPMs, including a technical description intended to supplement the Trusted Computing Group's TPM2 specifications. The intended audience is IT managers and engineers and graduate students in information security.

  20. Final Report on XStack: Software Synthesis for High Productivity ExaScale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Solar-Lezama, Armando [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Computer Science and Artificial Intelligence Lab.

    2016-07-12

    The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: a) Automated equivalence checking, b) Sketch-based program synthesis, and c) Autotuning. The report provides an executive summary of the research accomplished through this project. At the end of the report is appended a paper that describes in more detail the key technical accomplishments from this project, and which was published in SC 2014.

  1. A supervisor system for computer aided laser machining

    International Nuclear Information System (INIS)

    Mukherjee, J.K.

    1990-01-01

    Lasers achieve a non-divergent beam of short-wavelength energy which can propagate through the normal atmosphere with little divergence and can be focused on very fine points. The final high energy per unit area on target is highly localised and suitable for various types of machining at high speeds. The most notable factor is that this high-energy spot can be located precisely using light-weight optical components. Unlike electron beam and other techniques, laser machining places few demands on environmental conditions. Precision cutting and welding of nuclear materials in a normal or non-oxidising atmosphere can be done fairly easily using lasers. To achieve these objectives, development of a computer controlled laser machining system has been undertaken. The development project aims at building a computer aided machine with an indigenous controller and a medium-power laser suitable for cutting, welding, and marking. This paper describes the integration of the various computer aided functions, spanning the full range from job definition to final finished-part delivery, in computer aided laser machining. Various innovative features of the system that render it suitable for laser tool development as well as for special machining applications with user-friendliness have been covered. (author). 5 refs., 5 figs

  2. "Type Ia Supernovae: Tools for Studying Dark Energy" Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Woosley, Stan [Lick Observatory, San Jose, CA (United States); Kasen, Dan [Univ. of California, Berkeley, CA (United States)

    2017-05-10

    Final technical report for project "Type Ia Supernovae: Tools for the Study of Dark Energy" awarded jointly to scientists at the University of California, Santa Cruz and Berkeley, for computer modeling, theory and data analysis relevant to the use of Type Ia supernovae as standard candles for cosmology.

  3. Computer Problem-Solving Coaches for Introductory Physics: Design and Usability Studies

    Science.gov (United States)

    Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew

    2016-01-01

    The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how…

  4. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.

  5. Computational synthetic geometry

    CERN Document Server

    Bokowski, Jürgen

    1989-01-01

    Computational synthetic geometry deals with methods for realizing abstract geometric objects in concrete vector spaces. This research monograph considers a large class of problems from convexity and discrete geometry including constructing convex polytopes from simplicial complexes, vector geometries from incidence structures and hyperplane arrangements from oriented matroids. It turns out that algorithms for these constructions exist if and only if arbitrary polynomial equations are decidable with respect to the underlying field. Besides such complexity theorems a variety of symbolic algorithms are discussed, and the methods are applied to obtain new mathematical results on convex polytopes, projective configurations and the combinatorics of Grassmann varieties. Finally algebraic varieties characterizing matroids and oriented matroids are introduced providing a new basis for applying computer algebra methods in this field. The necessary background knowledge is reviewed briefly. The text is accessible to stud...

  6. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to definitely achieve the desired and preset objective(s). As a case study, we propose a formal framework on parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspect by taking this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in an evolutionary computation algorithm.
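
    As an editorial illustration of the idea that individuals act as self-interested agents choosing their own operators, here is a minimal sketch (not the paper's formal framework) in which each individual selects a mutation step size in proportion to the payoff that strategy has accumulated; the objective function, candidate strategies, and payoff rule are all illustrative assumptions.

        # Strategy selection as agent payoff maximization in a simple evolutionary loop.
        import random

        def fitness(x):                        # toy objective: maximize -(x - 3)^2
            return -(x - 3.0) ** 2

        STRATEGIES = [0.01, 0.1, 1.0]          # candidate mutation step sizes
        payoff = {s: 1.0 for s in STRATEGIES}  # running payoff per strategy

        population = [random.uniform(-10, 10) for _ in range(20)]
        for generation in range(200):
            new_population = []
            for x in population:
                # each agent picks a strategy with probability proportional to its payoff
                total = sum(payoff.values())
                s = random.choices(STRATEGIES, weights=[payoff[k] / total for k in STRATEGIES])[0]
                child = x + random.gauss(0.0, s)
                if fitness(child) > fitness(x):  # the strategy "wins": reward it
                    payoff[s] += 1.0
                    new_population.append(child)
                else:
                    new_population.append(x)
            population = new_population

        print(max(population, key=fitness), payoff)  # best individual and learned strategy payoffs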

  7. Symbolic computation of nonlinear wave interactions on MACSYMA

    International Nuclear Information System (INIS)

    Bers, A.; Kulp, J.L.; Karney, C.F.F.

    1976-01-01

    In this paper the use of a large symbolic computation system - MACSYMA - in determining approximate analytic expressions for the nonlinear coupling of waves in an anisotropic plasma is described. MACSYMA was used to implement solutions of the nonlinear partial differential equations of a fluid plasma model by perturbation expansions and subsequent iterative analytic computations. By interacting with the details of the symbolic computation, the physical processes responsible for particular nonlinear wave interactions could be uncovered and appropriate approximations introduced so as to simplify the final analytic result. Details of the MACSYMA system and its use are discussed and illustrated. (Auth.)
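
    For readers unfamiliar with this style of work, the toy sketch below shows a symbolic perturbation expansion in a modern system (SymPy standing in for MACSYMA); the equation is an illustrative example, not the plasma fluid model of the paper.

        # Toy perturbation expansion with SymPy: expand the roots of the weakly
        # perturbed equation x**2 + eps*x - 1 = 0 in powers of eps.
        import sympy as sp

        x, eps = sp.symbols('x epsilon')
        roots = sp.solve(sp.Eq(x**2 + eps * x - 1, 0), x)

        for r in roots:
            print(sp.simplify(sp.series(r, eps, 0, 4).removeO()))
        # the positive root expands as 1 - eps/2 + eps**2/8 (higher orders truncated)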

  8. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
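
    To make the FFD idea concrete, here is a minimal sketch of a two-dimensional Bezier-lattice free-form deformation of the kind such methods build on; the lattice size, the displaced control point, and the sample points are illustrative assumptions, not the paper's setup.

        # Minimal 2-D free-form deformation with a Bezier control lattice.
        import numpy as np
        from math import comb

        def bernstein(n, i, t):
            return comb(n, i) * t**i * (1 - t)**(n - i)

        def ffd(points, lattice):
            """Map points with local coords (s, t) in [0,1]^2 through an (n+1)x(m+1) lattice."""
            n, m = lattice.shape[0] - 1, lattice.shape[1] - 1
            out = np.zeros_like(points, dtype=float)
            for k, (s, t) in enumerate(points):
                for i in range(n + 1):
                    for j in range(m + 1):
                        out[k] += bernstein(n, i, s) * bernstein(m, j, t) * lattice[i, j]
            return out

        # 3x3 control lattice spanning the unit square; push the centre control point upward
        lattice = np.array([[(i / 2.0, j / 2.0) for j in range(3)] for i in range(3)], dtype=float)
        lattice[1, 1] += (0.0, 0.3)

        shape_points = np.array([[s, 0.5] for s in np.linspace(0.0, 1.0, 5)])  # points to deform
        print(ffd(shape_points, lattice))   # undeformed lattice would reproduce the points exactly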

  9. Computation of water hammer protection of modernized pumping station

    Directory of Open Access Journals (Sweden)

    Himr Daniel

    2014-03-01

    Finally, the pump trip was performed to verify whether the system worked correctly. The test showed that pressure pulsations are lower (better) than the computation predicted. This discrepancy was further analysed.

  10. Computation of Phase Equilibrium and Phase Envelopes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    In this technical report, we describe the computation of phase equilibrium and phase envelopes based on expressions for the fugacity coefficients. We derive those expressions from the residual Gibbs energy. We consider 1) ideal gases and liquids modeled with correlations from the DIPPR database and 2) nonideal gases and liquids modeled with cubic equations of state. Next, we derive the equilibrium conditions for an isothermal-isobaric (constant temperature, constant pressure) vapor-liquid equilibrium process (PT flash), and we present a method for the computation of phase envelopes. We formulate the involved equations in terms of the fugacity coefficients. We present expressions for the first-order derivatives. Such derivatives are necessary in computationally efficient gradient-based methods for solving the vapor-liquid equilibrium equations and for computing phase envelopes. Finally, we...
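
    As background to the PT-flash step mentioned in the abstract, the following is a minimal sketch of the two-phase split via the Rachford-Rice equation; the feed composition and the fixed K-values are illustrative assumptions (a full flash would recompute K from the fugacity coefficients and iterate).

        # Rachford-Rice split for an isothermal-isobaric (PT) flash, solved by bisection.
        import numpy as np

        def rachford_rice(z, K, tol=1e-12, iters=200):
            z, K = np.asarray(z, float), np.asarray(K, float)
            f = lambda b: np.sum(z * (K - 1.0) / (1.0 + b * (K - 1.0)))
            lo, hi = 0.0, 1.0                  # assumes both phases present: f(0) > 0 > f(1)
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0.0:
                    hi = mid
                else:
                    lo = mid
                if hi - lo < tol:
                    break
            beta = 0.5 * (lo + hi)             # vapor fraction
            x = z / (1.0 + beta * (K - 1.0))   # liquid composition
            y = K * x                          # vapor composition
            return beta, x, y

        beta, x, y = rachford_rice(z=[0.5, 0.3, 0.2], K=[3.0, 1.2, 0.3])
        print(beta, x.sum(), y.sum())          # the phase compositions each sum to ~1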

  11. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision.

    Science.gov (United States)

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on source tasks for generic purposes to the object tracking tasks using only a limited amount of training data. Finally, to alleviate the tracker drifting problem caused by model updating, we jointly consider three different types of positive samples. Extensive experiments validate the robustness and effectiveness of the proposed method.

  12. Cosmology Without Finality

    Science.gov (United States)

    Mahootian, F.

    2009-12-01

    The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive" framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries, can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness" of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything." But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final" theory.

  13. The development of mobile computation and the related formal description

    International Nuclear Information System (INIS)

    Jin Yan; Yang Xiaozong

    2003-01-01

    The description and formal representation of mobile computation are instructive for resolving issues of status transmission, domain administration, and authentication. This paper describes the communication and computational processes of mobile computation from the viewpoint of formal calculus and, moreover, constructs a practical application based on mobile ambients. Finally, the dissertation outlines future work and directions. (authors)

  14. Computational methods in several fields of radiation dosimetry

    International Nuclear Information System (INIS)

    Paretzke, Herwig G.

    2010-01-01

    Full text: Radiation dosimetry has to cope with a wide spectrum of applications and requirements in time and size. The ubiquitous presence of various radiation fields or radionuclides in the human home, working, urban or agricultural environment can lead to various dosimetric tasks, starting from radioecology, retrospective and predictive dosimetry, personal dosimetry, up to measurements of radionuclide concentrations in environmental and food products and, finally, in persons and their excreta. In all these fields, measurements and computational models for the interpretation or understanding of observations are employed explicitly or implicitly. In this lecture some examples of the author's own computational models will be given from the various dosimetric fields, including a) Radioecology (e.g. with the code systems based on ECOSYS, which was developed well before the Chernobyl reactor accident and tested thoroughly afterwards), b) Internal dosimetry (improved metabolism models based on our own data), c) External dosimetry (with the new ICRU-ICRP voxel phantom developed by our lab), d) Radiation therapy (with GEANT IV as applied to mixed reactor radiation incident on individualized voxel phantoms), e) Some aspects of nanodosimetric track structure computations (not dealt with in the other presentation by this author). Finally, some general remarks will be made on the high explicit or implicit importance of computational models in radiation protection and other research fields dealing with large systems, as well as on good scientific practices which should generally be followed when developing and applying such computational models

  15. XOQDOQ: computer program for the meteorological evaluation of routine effluent releases at nuclear power stations. Final report

    International Nuclear Information System (INIS)

    Sagendorf, J.F.; Goll, J.T.; Sandusky, W.F.

    1982-09-01

    Provided is a user's guide for the US Nuclear Regulatory Commission's (NRC) computer program X0QDOQ, which implements Regulatory Guide 1.111. This NUREG supersedes NUREG-0324, which was published as a draft in September 1977. This program is used by the NRC meteorology staff in their independent meteorological evaluation of routine or anticipated intermittent releases at nuclear power stations. It operates in a batch input mode and has various options a user may select. Relative atmospheric dispersion and deposition factors are computed for 22 specific distances out to 50 miles from the site for each directional sector. From these results, values for 10 distance segments are computed. The user may also select other locations for which atmospheric dispersion and deposition factors are computed. Program features, including required input data and output results, are described. A program listing and test case data input and resulting output are provided
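
    For context, the sector-averaged relative concentration evaluated by codes of this kind for a ground-level receptor at downwind distance x is typically of the following form (an illustrative expression consistent with Regulatory Guide 1.111-style Gaussian-plume modeling, not quoted from the X0QDOQ documentation):

        \[
          \frac{\chi}{Q} \;=\; \frac{2.032}{\sigma_z\,\bar{u}\,x}\,
          \exp\!\left(-\frac{h_e^{2}}{2\sigma_z^{2}}\right)
        \]

    where \(\sigma_z\) is the vertical dispersion parameter at x, \(\bar{u}\) the mean wind speed for the sector, and \(h_e\) the effective release height.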

  16. RESEARCH OF INFLUENCE OF COMPUTER TRAINING OF FUTURE LAWYERS ON INDICATORS OF ACADEMIC ACHIEVEMENT

    Directory of Open Access Journals (Sweden)

    M.I. Sherman

    2014-06-01

    Full Text Available The article is devoted to research on the influence of progress in Informatics and related disciplines on indicators of academic achievement. Scientific approaches to the definition of the term "academic achievement" are analyzed. It is substantiated that the computer and information competency of future lawyers is not only a part of their professional activity but also a powerful tool for solving information tasks of an educational character during professional training at the university, which is reflected in indicators of academic achievement. During the research we obtained correlation coefficients between final marks in the disciplines Informatics, Legal information retrieval systems and Legal statistics and the results of end-of-semester exams from the first to the fourth course, as well as the average score and quality coefficient of students' academic achievement in the control and experimental groups. In studying the influence of the level of computer and information competency of future lawyers on final achievement in the educational subjects it was established that: the level of the base component of computer and information competence, provided by study of the Informatics discipline at the information stage of the system of professional computer and information training, has a positive influence on students' final achievement in the educational subjects; this process is more effective in the experimental groups and influences the quality coefficient of final achievement to a greater extent than the average score; this influence is especially visible in the social, economic and humanitarian disciplines, as confirmed by the calculated correlation coefficients; at the axiological stage of the system of professional computer and information training the maximum values of the coefficients of

  17. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  18. Requiring students to have computers: questions for consideration.

    Science.gov (United States)

    McAuley, R J

    1998-06-01

    For the past several years a dialogue has been taking place in the offices, lounges, and meeting rooms of medical schools about whether medical students should be required to bring or purchase computers when they enter school. Microcomputers offer educators a unique opportunity to provide students with access to computer-assisted instruction, asynchronous communication, and extensive knowledge bases. However, there is still no evidence attesting to the effectiveness of computers as teaching or learning tools in medical education. The author raises questions that schools need to consider before requiring students to own computers: What kind of computer best suits their needs? What might impede using computers to teach? And who is currently requiring computers? In addressing the last question, the author presents information about 15 North American schools that currently require their students to have computers, reporting each school's software and hardware requirements; how each expects students to use the computers; and who covers the cost of the computers (the students or the school). Finally, he argues that major institutional commitment is needed for computers to be successfully integrated into any medical school curriculum.

  19. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in the grid environment, this paper presents a systematic study of the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work under this environment is completed

  20. Photovoltaic subsystem marketing and distribution model: programming manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained, including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.

  1. Experience of final examination for master's degree in optical engineering

    Science.gov (United States)

    Ivanova, Tatiana; Ezhova, Kseniia; Bakholdin, Alexey; Tolstoba, Nadezhda; Romanova, Galina

    2015-10-01

    At the end of a master's program it is necessary to measure students' knowledge and competences. The master's thesis is one way, but it measures deep knowledge in a quite narrow area. Another way to measure is an additional final examination that includes topics from the most important courses. In the Applied and Computer Optics Department of ITMO University such an examination includes theoretical questions and practical tasks from several courses in one examination. The theoretical section of the examination is written and the second section is practical. The practical section takes place in a laboratory with real equipment or with computer simulation. In the paper, examples of tasks for master programs and results of the examination are presented.

  2. EXPERIMENTS AND COMPUTATIONAL MODELING OF PULVERIZED-COAL IGNITION; FINAL

    International Nuclear Information System (INIS)

    Samuel Owusu-Ofori; John C. Chen

    1999-01-01

    Under typical conditions of pulverized-coal combustion, which is characterized by fine particles heated at very high rates, there is currently a lack of certainty regarding the ignition mechanism of bituminous and lower rank coals as well as the ignition rate of reaction. Furthermore, there have been no previous studies aimed at examining these factors under various experimental conditions, such as particle size, oxygen concentration, and heating rate. Finally, there is a need to improve current mathematical models of ignition to realistically and accurately depict the particle-to-particle variations that exist within a coal sample. Such a model is needed to extract useful reaction parameters from ignition studies, and to interpret ignition data in a more meaningful way. The authors propose to examine fundamental aspects of coal ignition through (1) experiments to determine the ignition temperature of various coals by direct measurement, and (2) modeling of the ignition process to derive rate constants and to provide a more insightful interpretation of data from ignition experiments. The authors propose to use a novel laser-based ignition experiment to achieve their first objective. Laser-ignition experiments offer the distinct advantage of easy optical access to the particles because of the absence of a furnace or radiating walls, and thus permit direct observation and particle temperature measurement. The ignition temperature of different coals under various experimental conditions can therefore be easily determined by direct measurement using two-color pyrometry. The ignition rate constants, when the ignition occurs heterogeneously, and the particle heating rates will both be determined from analyses based on these measurements
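
    For reference, two-color pyrometry of the kind mentioned above infers the particle temperature from the ratio \(R = I(\lambda_1)/I(\lambda_2)\) of radiated intensities at two wavelengths; under the Wien approximation and a gray-body assumption (an illustrative relation, not taken from this report),

        \[
          T \;=\; \frac{C_2\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1}\right)}
                       {\ln\!\left[R\,(\lambda_1/\lambda_2)^{5}\right]},
          \qquad C_2 \approx 1.439\times 10^{-2}\ \mathrm{m\,K}.
        \]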

  3. Computational techniques in tribology and material science at the atomic level

    Science.gov (United States)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  4. Parallel diffusion calculation for the PHAETON on-line multiprocessor computer

    International Nuclear Information System (INIS)

    Collart, J.M.; Fedon-Magnaud, C.; Lautard, J.J.

    1987-04-01

    The aim of the PHAETON project is the design of an on-line computer in order to increase the immediate knowledge of the main operating and safety parameters in power plants. A significant stage is the computation of the three-dimensional flux distribution. For cost and safety reasons a computer based on a parallel microprocessor architecture has been studied. This paper presents a first approach to parallelized three-dimensional diffusion calculation. Computing software has been written and installed on a four-processor demonstrator. We present the realization in progress concerning the final equipment. 8 refs

  5. Edge computing technologies for Internet of Things: a primer

    Directory of Open Access Journals (Sweden)

    Yuan Ai

    2018-04-01

    Full Text Available With the rapid development of mobile internet and Internet of Things applications, the conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine type of communication. Motivated to solve these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to edge devices of networks. Several edge computing technologies originating from different backgrounds to decrease latency, improve SE, and support the massive machine type of communication have been emerging. This paper comprehensively presents a tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access network are discussed. Finally, open issues and future research directions are identified as well. Keywords: Internet of Things (IoT), Mobile edge computing, Cloudlets, Fog computing

  6. A new stereotactic apparatus guided by computed tomography

    International Nuclear Information System (INIS)

    Huk, W.J.

    1981-01-01

    The accurate information provided by computed tomography about the existence, shape, and localization of intracranial neoplasms, at an early phase and in inaccessible regions, has greatly improved diagnostics, so that diagnosis now lies far ahead of the therapeutic possibilities for brain tumors. To reduce this wide margin we have developed a new targeting device which makes possible a stereotactic approach to central lesions under sight control by computed tomography, within the CT scanner. With the help of this simple device we are now able to perform stereotactic procedures for tumor biopsy guided by computed tomography, for needling and drainage of abscesses and cysts, and finally for the implantation of radioactive material for the interstitial radiotherapy of inoperable cysts and tumors. (orig.) [de

  7. A Monte Carlo program for generating hadronic final states

    International Nuclear Information System (INIS)

    Angelini, L.; Pellicoro, M.; Nitti, L.; Preparata, G.; Valenti, G.

    1991-01-01

    FIRST is a computer program to generate final states from high energy hadronic interactions using the Monte Carlo technique. It is based on a theoretical model in which the high degree of universality in such interactions is related to the existence of highly excited quark-antiquark bound states, called fire-strings. The program handles the decay of both fire-strings and unstable particles produced in the intermediate states. (orig.)

  8. Final Report for 'Implimentation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing'

    International Nuclear Information System (INIS)

    Vadlamani, Srinath; Kruger, Scott; Austin, Travis

    2008-01-01

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL via PETSc of the DOE SciDAC TOPS for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on the fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.

  9. GATE: Improving the computational efficiency

    International Nuclear Information System (INIS)

    Staelens, S.; De Beenhouwer, J.; Kruecker, D.; Maigne, L.; Rannou, F.; Ferrer, L.; D'Asseler, Y.; Buvat, I.; Lemahieu, I.

    2006-01-01

    GATE is a software package dedicated to Monte Carlo simulations in Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). An important disadvantage of such simulations is the fundamental burden of computation time. This manuscript describes three different techniques to improve the efficiency of these simulations. Firstly, the implementation of variance reduction techniques (VRTs), more specifically the incorporation of geometrical importance sampling, is discussed. After this, the newly designed cluster version of the GATE software is described. The experiments have shown that GATE simulations scale very well on a cluster of homogeneous computers. Finally, an elaboration on the deployment of GATE on the Enabling Grids for E-Science in Europe (EGEE) grid concludes the description of the efficiency enhancement efforts. The three aforementioned methods improve the efficiency of GATE to a large extent and make realistic patient-specific overnight Monte Carlo simulations achievable
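
    To illustrate the variance-reduction idea in its simplest form, the following is a toy one-dimensional importance sampling sketch (not GATE's geometrical importance sampling): a rare-event probability is estimated by sampling from a proposal concentrated in the region of interest and re-weighting.

        # Importance sampling vs. naive Monte Carlo for P(X > 3), X ~ standard normal.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # naive Monte Carlo: very few samples ever reach the region of interest
        x = rng.standard_normal(n)
        naive = np.mean(x > 3.0)

        # importance sampling: proposal g(x) = exp(-(x - 3)) on x > 3
        y = 3.0 + rng.exponential(1.0, n)
        phi = np.exp(-0.5 * y**2) / np.sqrt(2.0 * np.pi)   # standard normal density
        g = np.exp(-(y - 3.0))                              # proposal density
        weighted = (phi / g).mean()

        print(naive, weighted)   # both approach ~1.35e-3; the weighted estimate has far lower variance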

  10. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  11. Input/output routines for a hybrid computer

    International Nuclear Information System (INIS)

    Izume, Akitada; Yodo, Terutaka; Sakama, Iwao; Sakamoto, Akira; Miyake, Osamu

    1976-05-01

    This report is concerned with data processing programs for a hybrid computer system. In particular, the pre-processing of magnetic tapes recorded by the FACOM 270/25 data logging system during dynamic experiments at the 50 MW steam generator test facility is described in detail. Magnetic tape is a most effective recording medium for data logging, but recording formats differ between data logging systems. In our section, the final data analyses are performed on data held on the disk of the EAI-690 hybrid computer system, and to transfer all required information from magnetic tapes to the disk, magnetic tape editing and data transfer are carried out by the NEAC-3200 sub-computer system. This report is written for users as a manual and reference handbook for pre-data processing between computers of different types. (auth.)

  12. Examinations in the Final Year of Transition to Mathematical Methods Computer Algebra System (CAS)

    Science.gov (United States)

    Leigh-Lancaster, David; Les, Magdalena; Evans, Michael

    2010-01-01

    2009 was the final year of parallel implementation for Mathematical Methods Units 3 and 4 and Mathematical Methods (CAS) Units 3 and 4. From 2006-2009 there was a common technology-free short answer examination that covered the same function, algebra, calculus and probability content for both studies with corresponding expectations for key…

  13. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  14. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    This paper discusses the integration of user studies into computer graphics-related courses. Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similar fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  15. Eighth SIAM conference on parallel processing for scientific computing: Final program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This SIAM conference is the premier forum for developments in parallel numerical algorithms, a field that has seen very lively and fruitful developments over the past decade, and whose health is still robust. Themes for this conference were: combinatorial optimization; data-parallel languages; large-scale parallel applications; message-passing; molecular modeling; parallel I/O; parallel libraries; parallel software tools; parallel compilers; particle simulations; problem-solving environments; and sparse matrix computations.

  16. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
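
    As a concrete companion to the mathematical fundamentals mentioned above, here is a minimal sketch of the simplest parallel-beam forward model and an unfiltered backprojection; the phantom and angles are arbitrary illustrative choices, and a practical reconstruction would apply a ramp filter to each projection before backprojecting.

        # Parallel-beam projection (sinogram) and unfiltered backprojection.
        import numpy as np
        from scipy.ndimage import rotate

        def forward_project(image, angles_deg):
            """Rotate the object and sum along columns to form each projection."""
            return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                             for a in angles_deg])

        def back_project(sinogram, angles_deg, size):
            """Smear each projection back across the image plane and accumulate."""
            recon = np.zeros((size, size))
            for proj, a in zip(sinogram, angles_deg):
                recon += rotate(np.tile(proj, (size, 1)), -a, reshape=False, order=1)
            return recon / len(angles_deg)

        phantom = np.zeros((64, 64))
        phantom[24:40, 28:36] = 1.0                         # a simple block "object"
        angles = np.linspace(0.0, 180.0, 60, endpoint=False)
        sino = forward_project(phantom, angles)
        recon = back_project(sino, angles, 64)              # blurred without the ramp filter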

  17. Computer control of fuel handling activities at FFTF

    International Nuclear Information System (INIS)

    Romrell, D.M.

    1985-03-01

    The Fast Flux Test Facility near Richland, Washington, utilizes computer control for reactor refueling and other related core component handling and processing tasks. The computer controlled tasks described in this paper include core component transfers within the reactor vessel, core component transfers into and out of the reactor vessel, remote duct measurements of irradiated core components, remote duct cutting, and finally, transferring irradiated components out of the reactor containment building for off-site shipments or to long term storage. 3 refs., 16 figs

  18. Evaluating computer program performance on the CRAY-1

    International Nuclear Information System (INIS)

    Rudsinski, L.; Pieper, G.W.

    1979-01-01

    The Advanced Scientific Computers Project of Argonne's Applied Mathematics Division has two objectives: to evaluate supercomputers and to determine their effect on Argonne's computing workload. Initial efforts have focused on the CRAY-1, which is the only advanced computer currently available. Users from seven Argonne divisions executed test programs on the CRAY and made performance comparisons with the IBM 370/195 at Argonne. This report describes these experiences and discusses various techniques for improving run times on the CRAY. Direct translations of code from scalar to vector processor reduced running times as much as two-fold, and this reduction will become more pronounced as the CRAY compiler is developed. Further improvement (two- to ten-fold) was realized by making minor code changes to facilitate compiler recognition of the parallel and vector structure within the programs. Finally, extensive rewriting of the FORTRAN code structure reduced execution times dramatically, in three cases by a factor of more than 20; and even greater reduction should be possible by changing algorithms within a production code. It is concluded that the CRAY-1 would be of great benefit to Argonne researchers. Existing codes could be modified with relative ease to run significantly faster than on the 370/195. More important, the CRAY would permit scientists to investigate complex problems currently deemed infeasible on traditional scalar machines. Finally, an interface between the CRAY-1 and IBM computers such as the 370/195, scheduled by Cray Research for the first quarter of 1979, would considerably facilitate the task of integrating the CRAY into Argonne's Central Computing Facility. 13 tables
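
    The scalar-versus-vector distinction discussed above can be illustrated in a modern setting with the sketch below, which writes the same a*x + y update as an explicit element-by-element loop and as a single vectorized operation (NumPy standing in for a vectorizing compiler; the array size is arbitrary).

        # Scalar loop vs. vectorized form of the same update.
        import time
        import numpy as np

        n = 2_000_000
        a = 2.5
        x = np.random.rand(n)
        y = np.random.rand(n)

        t0 = time.perf_counter()
        out_loop = np.empty(n)
        for i in range(n):                 # scalar-style, element-by-element loop
            out_loop[i] = a * x[i] + y[i]
        t_loop = time.perf_counter() - t0

        t0 = time.perf_counter()
        out_vec = a * x + y                # vectorized form
        t_vec = time.perf_counter() - t0

        print(f"loop {t_loop:.2f}s, vectorized {t_vec:.4f}s, same result: {np.allclose(out_loop, out_vec)}")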

  19. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    Science.gov (United States)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than order-of-magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  20. A Multimedia Tutorial for Charged-Particle Beam Dynamics. Final report

    International Nuclear Information System (INIS)

    Silbar, Richard R.

    1999-01-01

    In September 1995 WhistleSoft, Inc., began developing a computer-based multimedia tutorial for charged-particle beam dynamics under Phase II of a Small Business Innovative Research grant from the U.S. Department of Energy. In Phase I of this project (see its Final Report) we had developed several prototype multimedia modules using an authoring system on NeXTStep computers. Such a platform was never our intended target, and when we began Phase II we decided to make the change immediately to develop our tutorial modules for the Windows and Macintosh microcomputer market. This Report details our progress and accomplishments. It also gives a flavor of the look and feel of the presently available and upcoming modules

  1. A Multimedia Tutorial for Charged-Particle Beam Dynamics. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Silbar, Richard R.

    1999-07-26

    In September 1995 WhistleSoft, Inc., began developing a computer-based multimedia tutorial for charged-particle beam dynamics under Phase II of a Small Business Innovative Research grant from the U.S. Department of Energy. In Phase I of this project (see its Final Report) we had developed several prototype multimedia modules using an authoring system on NeXTStep computers. Such a platform was never our intended target, and when we began Phase II we decided to make the change immediately to develop our tutorial modules for the Windows and Macintosh microcomputer market. This Report details our progress and accomplishments. It also gives a flavor of the look and feel of the presently available and upcoming modules.

  2. Deep Learning for Computer Vision: A Brief Review

    Science.gov (United States)

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  3. Deep Learning for Computer Vision: A Brief Review

    Directory of Open Access Journals (Sweden)

    Athanasios Voulodimos

    2018-01-01

    Full Text Available Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.

  4. Deep Learning for Computer Vision: A Brief Review.

    Science.gov (United States)

    Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.
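
    As a concrete companion to this review, the following is a minimal convolutional network of the kind surveyed, written in PyTorch; the layer sizes, input shape, and class count are arbitrary illustrative choices.

        # A tiny CNN for 28x28 grayscale images (illustrative sizes only).
        import torch
        import torch.nn as nn

        class TinyCNN(nn.Module):
            def __init__(self, num_classes=10):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(16 * 7 * 7, num_classes)

            def forward(self, x):
                x = self.features(x)
                return self.classifier(torch.flatten(x, 1))

        model = TinyCNN()
        logits = model(torch.randn(4, 1, 28, 28))   # a batch of 4 images
        print(logits.shape)                         # torch.Size([4, 10])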

  5. The use of symbolic computation in radiative, energy, and neutron transport calculations. Final report

    International Nuclear Information System (INIS)

    Frankel, J.I.

    1997-01-01

    This investigation used symbolic manipulation in developing analytical methods and general computational strategies for solving both linear and nonlinear, regular and singular, integral and integro-differential equations which appear in radiative and mixed-mode energy transport. Contained in this report are seven papers which present the technical results as individual modules

  6. Computational Astrophysics Consortium 3 - Supernovae, Gamma-Ray Bursts and Nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Woosley, Stan [Univ. of California, Santa Cruz, CA (United States)

    2014-08-29

    Final project report for UCSC's participation in the Computational Astrophysics Consortium - Supernovae, Gamma-Ray Bursts and Nucleosynthesis. The report of the entire Consortium is attached as an appendix.

  7. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise, working cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  8. Informatic parcellation of the network involved in the computation of subjective value

    Science.gov (United States)

    Rangel, Antonio

    2014-01-01

    Understanding how the brain computes value is a basic question in neuroscience. Although individual studies have driven much of the progress on this question, meta-analyses provide an opportunity to test hypotheses that require large collections of data. We carry out a meta-analysis of a large set of functional magnetic resonance imaging studies of value computation to address several key questions. First, what is the full set of brain areas that reliably correlate with stimulus values when they need to be computed? Second, is this set of areas organized into dissociable functional networks? Third, is a distinct network of regions involved in the computation of stimulus values at decision and outcome? Finally, are different brain areas involved in the computation of stimulus values for different reward modalities? Our results demonstrate the centrality of ventromedial prefrontal cortex (VMPFC), ventral striatum and posterior cingulate cortex (PCC) in the computation of value across tasks, reward modalities and stages of the decision-making process. We also find evidence of distinct subnetworks of co-activation within VMPFC, one involving central VMPFC and dorsal PCC and another involving more anterior VMPFC, left angular gyrus and ventral PCC. Finally, we identify a posterior-to-anterior gradient of value representations corresponding to concrete-to-abstract rewards. PMID:23887811

  9. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
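
    The accelerator framework described above offloads selected kernels to a quantum processing unit while the rest of the workload stays on conventional nodes. The paper gives no code; the toy Python sketch below only illustrates that host-side dispatch pattern, and every class and function name in it is hypothetical (the "QPU" call is just a classical stand-in so the example runs):

        from dataclasses import dataclass
        from typing import Any, Callable

        @dataclass
        class Kernel:
            """A unit of work; kernels tagged 'qpu' are candidates for offload."""
            name: str
            target: str                 # "qpu" or "cpu"
            payload: Callable[[], Any]

        def run_on_qpu(kernel: Kernel) -> Any:
            # Placeholder for submission to a quantum device runtime; simulated
            # classically here so the sketch is self-contained.
            return kernel.payload()

        def run_on_cpu(kernel: Kernel) -> Any:
            return kernel.payload()

        def dispatch(kernels):
            """Host-side scheduler: offload selected kernels, keep the rest local."""
            results = {}
            for k in kernels:
                results[k.name] = run_on_qpu(k) if k.target == "qpu" else run_on_cpu(k)
            return results

        jobs = [
            Kernel("classical_preprocessing", "cpu", lambda: sum(range(1000))),
            Kernel("ground_state_estimate", "qpu", lambda: -1.137),  # stand-in value
        ]
        print(dispatch(jobs))

    In a real hybrid system the "qpu" branch would submit the kernel to a device-specific runtime, and the scheduler would also have to account for queueing, data movement, and error mitigation.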

  10. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards the goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  11. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  12. A Multidisciplinary Research Team Approach to Computer-Aided Drafting (CAD) System Selection. Final Report.

    Science.gov (United States)

    Franken, Ken; And Others

    A multidisciplinary research team was assembled to review existing computer-aided drafting (CAD) systems for the purpose of enabling staff in the Design Drafting Department at Linn Technical College (Missouri) to select the best system out of the many CAD systems in existence. During the initial stage of the evaluation project, researchers…

  13. Solid-state nuclear-spin quantum computer based on magnetic resonance force microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Doolen, G. D.; Hammel, P. C.; Tsifrinovich, V. I.

    2000-01-01

    We propose a nuclear-spin quantum computer based on magnetic resonance force microscopy (MRFM). It is shown that an MRFM single-electron spin measurement provides three essential requirements for quantum computation in solids: (a) preparation of the ground state, (b) one- and two-qubit quantum logic gates, and (c) a measurement of the final state. The proposed quantum computer can operate at temperatures up to 1 K. (c) 2000 The American Physical Society

  14. A look back: 57 years of scientific computing

    DEFF Research Database (Denmark)

    Wasniewski, Jerzy

    2012-01-01

    This document outlines my 57-year career in computational mathematics, a career that took me from Poland to Canada and finally to Denmark. It of course spans a period in which both hardware and software developed enormously. Along the way I was fortunate to be faced with fascinating technical challenges and privileged to be able to share them with inspiring colleagues. From the beginning, my work to a great extent was concerned, directly or indirectly, with computational linear algebra, an interest I maintain even today.

  15. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
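
    To make the idea of a maturity assessment framework concrete, the sketch below scores a simulation against a small set of criteria on an ordinal scale. The criterion names and the minimum-based aggregation are illustrative assumptions (loosely in the spirit of common predictive-capability maturity schemes), not the framework defined in the paper:

        from dataclasses import dataclass

        @dataclass
        class Criterion:
            """One maturity criterion with an assessed level on a 0-3 ordinal scale."""
            name: str
            level: int  # 0 = lowest maturity, 3 = highest

        def assess(criteria):
            """A maturity assessment set: per-criterion levels plus the minimum,
            since overall trust is usually limited by the weakest criterion."""
            overall = min(c.level for c in criteria)
            return {c.name: c.level for c in criteria}, overall

        framework = [
            Criterion("code verification", 2),
            Criterion("solution verification", 1),
            Criterion("model validation", 2),
            Criterion("uncertainty quantification", 1),
        ]
        per_criterion, overall = assess(framework)
        print(per_criterion, "overall maturity:", overall)

    A reviewer applying such a framework would document the evidence behind each assigned level, which is where the formal definition of simulation review in the paper does its work.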

  16. A STATISTICAL ANALYSIS OF GDP AND FINAL CONSUMPTION USING SIMPLE LINEAR REGRESSION. THE CASE OF ROMANIA 1990–2010

    OpenAIRE

    Aniela Balacescu; Marian Zaharia

    2011-01-01

    This paper aims to examine the causal relationship between GDP and final consumption. The authors used a linear regression model in which GDP is considered the result variable and final consumption the factor variable. In drafting the article we used the Excel software application, a modern tool for computing and statistical data analysis.
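
    The model in question is an ordinary least-squares fit of GDP on final consumption. The authors carried out the computation in Excel; a rough NumPy equivalent, using made-up figures rather than the Romanian 1990-2010 series, might look like this:

        import numpy as np

        # Illustrative figures only -- not the Romanian 1990-2010 series used in the paper.
        final_consumption = np.array([80.0, 86.0, 97.0, 110.0, 124.0])   # factor variable x
        gdp = np.array([100.0, 111.0, 123.0, 142.0, 159.0])              # result variable y

        # Simple linear regression y = a + b*x by ordinary least squares.
        b, a = np.polyfit(final_consumption, gdp, deg=1)
        predicted = a + b * final_consumption
        ss_res = np.sum((gdp - predicted) ** 2)
        ss_tot = np.sum((gdp - gdp.mean()) ** 2)
        r_squared = 1.0 - ss_res / ss_tot
        print(f"slope={b:.3f}, intercept={a:.3f}, R^2={r_squared:.3f}")

    Excel's LINEST or the chart trendline produces the same slope, intercept, and R-squared for a single-predictor model.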

  17. 1994 CERN school of computing. Proceedings

    International Nuclear Information System (INIS)

    Vandoni, C.E.; Verkerk, C.

    1995-01-01

    These Proceedings contain a written account of the majority of the lectures given at the 1994 CERN School of Computing. A number of lectures dealt with general aspects of computing, in particular in the areas of high performance computing in embedded systems, distributed and heterogeneous computing, multimedia information systems and on the impact of computing on High Energy Physics. Modelling and Simulation were treated with emphasis on Statistical and High Energy Physics, and a simulation package (GEANT) and its future development were presented in detail. Hardware aspects were presented, in particular in the areas of massively parallel associative string processors and CISC vs RISC processor architectures, and a summary of an analogic supercomputer chip architecture was given. The software development process and associated technologies were the subject of full presentations. Software for Data Acquisition Systems was discussed in a number of lectures. We also reproduce, as an appendix, a set of self-explanatory transparencies used by one lecturer in a particularly detailed presentation of this subject. The H1 trigger system was presented in detail. Finally, lectures were given on a parallel program supervisor and parallel language processing generation. (orig.)

  18. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  19. Staging with computed tomography of patients with colon cancer

    DEFF Research Database (Denmark)

    Malmstrom, M. L.; Brisling, S.; Klausen, T. W.

    2018-01-01

    Purpose: Accurate staging of colonic cancer is important for patient stratification. We aimed to correlate the diagnostic accuracy of preoperative computed tomography (CT) with final histopathology as the reference standard. Methods: Data were collected retrospectively on 615 consecutive patients operated...

  20. [Towards computer-aided catalyst design: Three effective core potential studies of C-H activation]. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    Research in the initial grant period focused on computational studies relevant to the selective activation of methane, the prime component of natural gas. Reaction coordinates for methane activation by experimental models were delineated, as well as the bonding and structure of complexes that effect this important reaction. This research, highlighted in the following sections, also provided the impetus for further development, and application of methods for modeling metal-containing catalysts. Sections of the report describe the following: methane activation by multiple-bonded transition metal complexes; computational lanthanide chemistry; and methane activation by non-imido, multiple-bonded ligands.

  1. Electroproduction of associated two-body final states

    International Nuclear Information System (INIS)

    Harding, D.J.

    1983-01-01

    The Large Aperture Magnet Experiment at the Cornell Electron Synchrotron measured electron scattering in the kinematic region defined by W² and Q² cuts at 2.98 GeV² and 0.5-2 GeV², respectively. The 11.5 GeV extracted electron beam struck a liquid hydrogen target in an eight-kilogauss magnetic field. The charged particles in the final state were tracked through the field by a multiwire proportional chamber system of 34 planes. A lead-scintillator shower counter triggered the experiment on detection of a scattered electron. Time-of-flight and water Cherenkov counters identified some of the final-state hadrons. The data recorded on tape were then passed through computer programs which linked proportional chamber hits into tracks, fit momenta to the tracks, applied particle identification algorithms, selected interesting events, and plotted histograms of invariant masses. All of this is described here in detail, with special attention to the front-end electronics and the track-finding program. Many specific final states were observed. The analysis presented here concentrates on the reaction γ_v p → pπ⁺π⁻π⁰, with the final hadrons resulting from the decay of a two-body state. The states pω⁰ and pη⁰ are measured. Limits are set for the production of Δ⁺⁺ρ⁻, Δ⁺ρ⁰, and Δρ⁺. The conclusion the author draws is that hadron-like two-body processes are almost completely absent in virtual photon scattering in this kinematic region. Vector meson production, excitation of the nucleons, and the scattering of photons directly from individual partons are the important processes.

  2. Final Report Extreme Computing and U.S. Competitiveness DOE Award. DE-FG02-11ER26087/DE-SC0008764

    Energy Technology Data Exchange (ETDEWEB)

    Mustain, Christopher J. [Council on Competitiveness, Washington, DC (United States)

    2016-01-13

    The Council has acted on each of the grant deliverables during the funding period. The deliverables are: (1) convening the Council’s High Performance Computing Advisory Committee (HPCAC) on a bi-annual basis; (2) broadening public awareness of high performance computing (HPC) and exascale developments; (3) assessing the industrial applications of extreme computing; and (4) establishing a policy and business case for an exascale economy.

  3. An introduction to geometric computation

    International Nuclear Information System (INIS)

    Nievergelt, J.

    1991-01-01

    Computational geometry has some appealing features that make it ideal for learning about algorithms and data structures: the problem statements are easily understood, intuitively meaningful, and mathematically rigorous; the problem statement, the solution, and every step of the construction have natural visual representations that support abstract thinking and help in detecting errors of reasoning; and, finally, these algorithms are practical, because it is easy to come up with examples where they can be applied.

  4. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report; Zuverlaessigkeitsuntersuchung und -berechnung rechnerbasierter Sicherheitsleittechnik zum Einsatz in deutschen Kernkraftwerken. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yongjian [Hochschule Magdeburg-Stendal, Magdeburg (Germany). Inst. fuer Elektrotechnik; Krause, Ulrich [Magdeburg Univ. (Germany). Inst. fuer Apparate- und Umwelttechnik; Gu, Chunlei

    2014-08-21

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications and whose functionalities are freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired technology lie, for example, in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C technology a higher potential for common cause failures (CCF) and for easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and Magdeburg-Stendal University of Applied Sciences are therefore trying to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall be extended to this area, providing a scientific contribution to sound reliability judgments on digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method, the multi-class binomial failure rate (MCFBR) approach, which was successfully used in safety valve applications, will be applied.

  5. Computer Aided Drug Design: Success and Limitations.

    Science.gov (United States)

    Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho

    2016-01-01

    Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.

  6. Advances in Integrated Computational Materials Engineering "ICME"

    Science.gov (United States)

    Hirsch, Jürgen

    The methods of Integrated Computational Materials Engineering that were developed and successfully applied for Aluminium have been constantly improved. The main aspects and recent advances of integrated material and process modeling are simulations of material properties, such as strength and forming properties, and of the specific microstructure evolution during processing (rolling, extrusion, annealing) under the influence of material constitution and process variations, through the production process down to the final application. Examples are discussed for the through-process simulation of microstructures and related properties of Aluminium sheet, including DC ingot casting, pre-heating and homogenization, hot and cold rolling, and final annealing. New results are included on the simulation of solution annealing and age hardening of 6xxx alloys for automotive applications. Physically based quantitative descriptions and computer-assisted evaluation methods are new ICME approaches for integrating new simulation tools, also for customer applications such as heat-affected zones in the welding of age-hardening alloys. Also discussed is the estimation of the effects of specific elements arising from growing recycling volumes, which is requested even for high-end Aluminium products and is of special interest to the Aluminium-producing industries.

  7. Exploiting Virtualization and Cloud Computing in ATLAS

    International Nuclear Information System (INIS)

    Harald Barreiro Megino, Fernando; Van der Ster, Daniel; Benjamin, Doug; De, Kaushik; Gable, Ian; Paterson, Michael; Taylor, Ryan; Hendrix, Val; Vitillo, Roberto A; Panitkin, Sergey; De Silva, Asoka; Walker, Rod

    2012-01-01

    The ATLAS Computing Model was designed around the concept of grid computing; since the start of data-taking, this model has proven very successful in the federated operation of more than one hundred Worldwide LHC Computing Grid (WLCG) sites for offline data distribution, storage, processing and analysis. However, new paradigms in computing, namely virtualization and cloud computing, present improved strategies for managing and provisioning IT resources that could allow ATLAS to more flexibly adapt and scale its storage and processing workloads on varied underlying resources. In particular, ATLAS is developing a “grid-of-clouds” infrastructure in order to utilize WLCG sites that make resources available via a cloud API. This work will present the current status of the Virtualization and Cloud Computing R and D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a “cloud factory” for managing cloud VM instances. Next, performance results when running on virtualized/cloud resources at CERN LxCloud, StratusLab, and elsewhere will be presented. Finally, we will present the ATLAS strategies for exploiting cloud-based storage, including remote XROOTD access to input data, management of EC2-based files, and the deployment of cloud-resident LCG storage elements.

  8. Approximability of optimization problems through adiabatic quantum computation

    CERN Document Server

    Cruz-Santos, William

    2014-01-01

    The adiabatic quantum computation (AQC) is based on the adiabatic theorem to approximate solutions of the Schrödinger equation. The design of an AQC algorithm involves the construction of a Hamiltonian that describes the behavior of the quantum system. This Hamiltonian is expressed as a linear interpolation of an initial Hamiltonian whose ground state is easy to compute, and a final Hamiltonian whose ground state corresponds to the solution of a given combinatorial optimization problem. The adiabatic theorem asserts that if the time evolution of a quantum system described by a Hamiltonian is long enough compared with the inverse of the minimum energy gap encountered along the interpolation, then the system remains close to its instantaneous ground state, so that measuring the final state yields the solution of the optimization problem.
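
    A worked toy example of the interpolation described above: for two qubits, take a transverse-field initial Hamiltonian H0 (whose ground state is easy to prepare) and a diagonal cost Hamiltonian H1, and track the spectral gap of H(s) = (1-s)H0 + sH1. The cost values below are arbitrary illustrative choices, not taken from the book:

        import numpy as np

        # Minimal adiabatic-interpolation sketch for two qubits (illustrative only).
        sx = np.array([[0.0, 1.0], [1.0, 0.0]])
        I2 = np.eye(2)

        # Initial Hamiltonian: transverse field; its ground state is the uniform superposition.
        H0 = -(np.kron(sx, I2) + np.kron(I2, sx))

        # Final Hamiltonian: diagonal cost function over the bit strings 00, 01, 10, 11.
        costs = np.array([3.0, 1.0, 2.0, 0.0])   # the optimum is the string 11
        H1 = np.diag(costs)

        def gap(s):
            # Gap between ground and first excited state of H(s) = (1-s)*H0 + s*H1.
            evals = np.linalg.eigvalsh((1.0 - s) * H0 + s * H1)
            return evals[1] - evals[0]

        s_grid = np.linspace(0.0, 1.0, 101)
        min_gap = min(gap(s) for s in s_grid)
        print("minimum spectral gap:", min_gap)               # sets how slowly to interpolate
        print("optimal bit string:", format(int(np.argmin(costs)), "02b"))

    The minimum gap along the path is what the adiabatic condition refers to: the smaller it is, the longer the interpolation must take for the system to stay near its instantaneous ground state.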

  9. Computation of Asteroid Proper Elements: Recent Advances

    Science.gov (United States)

    Knežević, Z.

    2017-12-01

    The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.

  10. National energy peak leveling program (NEPLP). Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    This three-volume report is responsive to the requirements of Contract E (04-3)-1152 to provide a detailed methodology, to include management, technology, and socio-economic aspects, of a voluntary community program of computer-assisted peak load leveling and energy conservation in commercial community facilities. The demonstration project established proof-of-concept in reducing the kW-demand peak by the unofficial goal of 10%, with concurrent kWh savings. This section of the three volume report is a final report appendix with information on the National Energy Peak Leveling Program (NEPLP).

  11. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  12. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council (FCC) perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during the final review of contributions from the various working groups and portray our latest information. This report itself should be considered dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to the requirements of magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphic services, X-window usage, remote experimental collaborations, remote data access for specific projects, and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC sponsored workshops.

  13. Control mechanism of double-rotator-structure ternary optical computer

    Science.gov (United States)

    Kai, SONG; Liping, YAN

    2017-03-01

    The double-rotator-structure ternary optical processor (DRSTOP) has two distinguishing characteristics, namely giant data-bit parallelism and processor reconfigurability: it can handle thousands of data bits in parallel and can run much faster than electronic computers and other optical computing systems demonstrated so far. In order to put DRSTOP into practical application, this paper established a series of methods, namely a task classification method, a data-bit allocation method, a control information generation method, a control information formatting and sending method, and a decoded-results retrieval method. These methods form the control mechanism of DRSTOP, and this control mechanism makes DRSTOP an automated computing platform. Compared with traditional computing tools, the DRSTOP platform can ease the tension between high energy consumption and big-data computing by greatly reducing the cost of communications and I/O. Finally, the paper designed a set of experiments for the DRSTOP control mechanism to verify its feasibility and correctness. Experimental results showed that the control mechanism is correct, feasible, and efficient.

  14. Reviving a medical wearable computer for teaching purposes.

    Science.gov (United States)

    Frenger, Paul

    2014-01-01

    In 1978 the author constructed a medical wearable computer using an early CMOS microprocessor and support chips. This device was targeted for use by health-conscious consumers and other early adopters. Its expandable functions included weight management, blood pressure control, diabetes care, medication reminders, smoking cessation, pediatric growth and development, a simple medical database, digital communication with a doctor’s office, and an emergency alert system. Various physiological sensors could be plugged into the calculator-sized chassis. The device was shown to investor groups but funding was not obtained; by 1992 the author ceased pursuing it. The Computing and Mathematics Chair at a local University, a NASA acquaintance, approached the author to mentor a CS capstone course for Summer 2012. With the author’s guidance, five students proceeded to convert this medical wearable computer design to an iPhone-based implementation using the Apple Xcode Developer Kit and other utilities. The final student device contained a body mass index (BMI) calculator, an emergency alert for 911 or other first responders, a medication reminder, a doctor’s appointment feature, a medical database, medical Internet links, and a pediatric growth & development guide. The students’ final implementation was successfully demonstrated on an actual iPhone 4 at the CS capstone meeting in mid-Summer.

  15. Gender stereotypes, aggression, and computer games: an online survey of women.

    Science.gov (United States)

    Norris, Kamala O

    2004-12-01

    Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted on women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play video games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.

  16. Diagnostic utility of fluorodeoxyglucose positron emission tomography/computed tomography in pyrexia of unknown origin

    International Nuclear Information System (INIS)

    Singh, Nidhi; Kumar, Rakesh; Malhotra, Arun; Bhalla, Ashu Seith; Kumar, Uma; Sood, Rita

    2005-01-01

    The present study was undertaken to evaluate the diagnostic utility of fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography (F-18 FDG PET/CT) in patients presenting as pyrexia of unknown origin (PUO). Forty-seven patients (31 males and 16 females; mean age of 42.7 ± 19.96 years) presenting as PUO to the Department of Medicine at the All India Institute of Medical Sciences, New Delhi over a period of 2 years underwent F-18 FDG PET/CT. PET/CT was considered supportive when its results correlated with the final definitive diagnosis. Final diagnosis was made on the basis of a combined evaluation of history, clinical findings, investigations, and response to treatment. Thirty-five PET/CT studies (74.5%) were positive. However, only 18 (38.3%) were supportive of the final diagnosis. In three patients (6.4%), PET/CT was considered diagnostic, as none of the other investigations, including contrast-enhanced computed tomography of the chest and abdomen and directed tissue sampling, could lead to the final diagnosis. All three of these patients were diagnosed with aortoarteritis. Fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography is an important emerging modality in the workup of PUO. It supported the final diagnosis in 38% of our patients and was diagnostic in 6.4% of patients. Thus, PET/CT should only be considered as a second-line investigation for the diagnostic evaluation of PUO, especially in suspected noninfectious inflammatory disorders.

  17. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated extraction of relevant features. Finally, we intend to discuss preliminary computer simulation issues.

  18. A spacecraft computer repairable via command.

    Science.gov (United States)

    Fimmel, R. O.; Baker, T. E.

    1971-01-01

    The MULTIPAC is a central data system developed for deep-space probes with the distinctive feature that it may be repaired during flight via command and telemetry links by reprogramming around the failed unit. The computer organization uses pools of identical modules which the program organizes into one or more computers called processors. The interaction of these modules is dynamically controlled by the program rather than hardware. In the event of a failure, new programs are entered which reorganize the central data system with a somewhat reduced total processing capability aboard the spacecraft. Emphasis is placed on the evolution of the system architecture and the final overall system design rather than the specific logic design.

  19. Image processing with massively parallel computer Quadrics Q1

    International Nuclear Information System (INIS)

    Della Rocca, A.B.; La Porta, L.; Ferriani, S.

    1995-05-01

    To evaluate the image processing capabilities of the massively parallel computer Quadrics Q1, a convolution algorithm has been implemented, and it is described in this report. First, the mathematical definition of the discrete convolution is recalled, together with the main Q1 hardware and software features. Then the different codings of the algorithm are described, and the Q1 performance is compared with that obtained on other computers. Finally, the conclusions report the main results and suggestions.
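
    For reference, the discrete convolution being recalled is presumably the standard one; writing f for the image and k for a (2M+1)×(2N+1) kernel (notation introduced here, not taken from the report):

        \[
          (f * k)(x, y) \;=\; \sum_{m=-M}^{M} \sum_{n=-N}^{N} f(x - m,\, y - n)\, k(m, n)
        \]

    Each output pixel depends only on a small neighbourhood of the input, which is why such kernels map naturally onto massively parallel machines: the output image can be tiled across processing nodes, with each node needing only its tile plus a small halo of border pixels.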

  20. Heterogeneous real-time computing in radio astronomy

    Science.gov (United States)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

    Modern computer architectures suited for general purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not to choose a single architecture, but to take advantage of the best characteristics of different computer architectures to solve your problems. This paper examines the tradeoffs between using computer systems based on the ubiquitous X86 Central Processing Units (CPU's), Field Programmable Gate Array (FPGA) based signal processors, and Graphical Processing Units (GPU's). We will show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGA's tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing to the system. These FPGA's are coupled to other FPGA's to continue to provide highly efficient processing power. Data is then packaged up and shipped over fast networks to a cluster of general purpose computers equipped with GPU's, which are used for floating-point intensive computation. Finally, the data is handled by the CPU and written to disk, or further processed. Each of the elements in the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least, in terms of power, space, and money.

  1. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    , a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  2. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    Science.gov (United States)

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  3. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    Science.gov (United States)

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  4. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  5. Color evaluation of computer-generated color rainbow holography

    International Nuclear Information System (INIS)

    Shi, Yile; Wang, Hui; Wu, Qiong

    2013-01-01

    A color evaluation approach for computer-generated color rainbow holography (CGCRH) is presented. Firstly, the relationship between color quantities of a computer display and a color computer-generated holography (CCGH) colorimetric system is discussed based on color matching theory. An isochromatic transfer relationship of color quantity and amplitude of object light field is proposed. Secondly, the color reproduction mechanism and factors leading to the color difference between the color object and the holographic image that is reconstructed by CGCRH are analyzed in detail. A quantitative color calculation method for the holographic image reconstructed by CGCRH is given. Finally, general color samples are selected as numerical calculation test targets and the color differences between holographic images and test targets are calculated based on our proposed method. (paper)

  6. Systematic Errors in Dimensional X-ray Computed Tomography

    DEFF Research Database (Denmark)

    that it is possible to compensate for them. In dimensional X-ray computed tomography (CT), many physical quantities influence the final result. However, it is important to know which factors in CT measurements potentially lead to systematic errors. In this talk, typical error sources in dimensional X-ray CT are discussed...

  7. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shuo Gu

    2017-01-01

    Full Text Available With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  8. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    Science.gov (United States)

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  9. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which enable students to grasp the fundamentals of seismic tomography easily. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
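
    At its core, the straight-ray travel-time tomography typically used in such introductory exercises is a linear inverse problem t = L s, where t holds the observed travel times, s the unknown cell slownesses, and L the ray lengths per cell. A self-contained NumPy toy, with all numbers invented for illustration, is:

        import numpy as np

        # Toy straight-ray travel-time tomography: t = L s, solved by least squares.
        # L[i, j] is the length of ray i inside cell j of a 2x2 model; s is the
        # slowness (1/velocity) of each cell.
        L = np.array([
            [1.0, 1.0, 0.0, 0.0],   # ray crossing the top row
            [0.0, 0.0, 1.0, 1.0],   # ray crossing the bottom row
            [1.0, 0.0, 1.0, 0.0],   # ray crossing the left column
            [0.0, 1.0, 0.0, 1.0],   # ray crossing the right column
            [1.4, 0.0, 0.0, 1.4],   # diagonal ray
        ])
        true_slowness = np.array([0.50, 0.25, 0.25, 0.50])   # e.g. s/km
        travel_times = L @ true_slowness                      # "observed" data (noise-free)

        # Invert for the slowness model; lstsq handles over- and under-determined cases.
        estimated, *_ = np.linalg.lstsq(L, travel_times, rcond=None)
        print(estimated.reshape(2, 2))

    In Modellus the same relationships can be written directly as equations and animated, which is exactly the kind of expressive modelling the abstract advocates.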

  10. Constructivity and computability in historical and philosophical perspective

    CERN Document Server

    Dubucs, Jacques

    2014-01-01

    Ranging from Alan Turing's seminal 1936 paper to the latest work on Kolmogorov complexity and linear logic, this comprehensive new work clarifies the relationship between computability on the one hand and constructivity on the other. The authors argue that even though constructivists have largely shed Brouwer's solipsistic attitude to logic, there remain points of disagreement to this day. Focusing on the growing pains computability experienced as it was forced to address the demands of rapidly expanding applications, the content maps the developments following Turing's ground-breaking linkage of computation and the machine, the resulting birth of complexity theory, the innovations of Kolmogorov complexity, and the resolution of the dissonances between proof-theoretical semantics and canonical proof feasibility. Finally, it explores one of the most fundamental questions concerning the interface between constructivity and computability: whether the theory of recursive functions is needed for a rigorous development of constructive mathematics.

  11. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and other social sciences. However, there are very few detailed accounts of the computations within the model; articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis will be described. Then, the method to simulate data in R (with standardized coefficients) will be presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), will be developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
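
    The paper's worked examples are in R; purely as an illustration of the same computations (simulated standardized data, paths a and b estimated by OLS, and a percentile bootstrap of the indirect effect a*b), a rough Python analogue might look as follows. The path values 0.5, 0.4 and 0.2 below are arbitrary choices, not the paper's:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 500

        # Simulate standardized data for a simple mediation model X -> M -> Y.
        a_true, b_true, c_prime = 0.5, 0.4, 0.2
        x = rng.standard_normal(n)
        m = a_true * x + rng.standard_normal(n)
        y = b_true * m + c_prime * x + rng.standard_normal(n)

        def slope(pred, resp):
            # OLS slope of resp on pred (single predictor, with intercept).
            X = np.column_stack([np.ones_like(pred), pred])
            return np.linalg.lstsq(X, resp, rcond=None)[0][1]

        def paths(x, m, y):
            X = np.column_stack([np.ones(len(x)), x, m])
            b = np.linalg.lstsq(X, y, rcond=None)[0][2]   # effect of M on Y controlling for X
            return slope(x, m), b

        a_hat, b_hat = paths(x, m, y)
        indirect = a_hat * b_hat

        # Percentile bootstrap confidence interval for the indirect effect a*b.
        boot = np.empty(2000)
        for i in range(boot.size):
            idx = rng.integers(0, n, n)
            a_i, b_i = paths(x[idx], m[idx], y[idx])
            boot[i] = a_i * b_i
        ci = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect = {indirect:.3f}, 95% bootstrap CI = [{ci[0]:.3f}, {ci[1]:.3f}]")

    The Sobel test mentioned in the abstract would instead compare z = a*b / sqrt(b²·SE_a² + a²·SE_b²) against a standard normal distribution; the bootstrap is generally preferred because the sampling distribution of a*b is skewed.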

  12. Final Report: Symposium on Adaptive Methods for Partial Differential Equations

    Energy Technology Data Exchange (ETDEWEB)

    Pernice, M.; Johnson, C.R.; Smith, P.J.; Fogelson, A.

    1998-12-10

    OAK-B135 Final Report: Symposium on Adaptive Methods for Partial Differential Equations. Complex physical phenomena often include features that span a wide range of spatial and temporal scales. Accurate simulation of such phenomena can be difficult to obtain, and computations that are under-resolved can even exhibit spurious features. While it is possible to resolve small scale features by increasing the number of grid points, global grid refinement can quickly lead to problems that are intractable, even on the largest available computing facilities. These constraints are particularly severe for three dimensional problems that involve complex physics. One way to achieve the needed resolution is to refine the computational mesh locally, in only those regions where enhanced resolution is required. Adaptive solution methods concentrate computational effort in regions where it is most needed. These methods have been successfully applied to a wide variety of problems in computational science and engineering. Adaptive methods can be difficult to implement, prompting the development of tools and environments to facilitate their use. To ensure that the results of their efforts are useful, algorithm and tool developers must maintain close communication with application specialists. Conversely it remains difficult for application specialists who are unfamiliar with the methods to evaluate the trade-offs between the benefits of enhanced local resolution and the effort needed to implement an adaptive solution method.

  13. Predictor - Predictive Reaction Design via Informatics, Computation and Theories of Reactivity

    Science.gov (United States)

    2017-10-10

    RPPR Final Report as of 24-Nov-2017; PI: Dean J. Tantillo (djtantillo@ucdavis.edu); approved for public release. [...] meaningful queries is finding a balance between the amount of detail in the metadata and computed results stored in the database vs. writing data

  14. CT applications of medical computer graphics

    International Nuclear Information System (INIS)

    Rhodes, M.L.

    1985-01-01

    Few applications of computer graphics show as much promise and early success as that for CT. Unlike electron microscopy, ultrasound, business, military, and animation applications, CT image data are inherently digital. CT pictures can be processed directly by programs well established in the fields of computer graphics and digital image processing. Methods for reformatting digital pictures, enhancing structure shape, reducing image noise, and rendering three-dimensional (3D) scenes of anatomic structures have all become routine at many CT centers. In this chapter, the authors provide a brief introduction to computer graphics terms and techniques commonly applied to CT pictures and, when appropriate, to those showing promise for magnetic resonance images. Topics discussed here are image-processing options that are applied to digital images already constructed. In the final portion of this chapter techniques for ''slicing'' CT image data are presented, and geometric principles that describe the specification of oblique and curved images are outlined. Clinical examples are included

  15. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  16. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  17. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights of the future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  18. Ethical aspects of final disposal. Final report

    International Nuclear Information System (INIS)

    Baltes, B.; Leder, W.; Achenbach, G.B.; Spaemann, R.; Gerhardt, V.

    2003-01-01

    In fulfilment of this task the Federal Environmental Ministry has commissioned GRS to summarise the current national and international status of ethical aspects of the final disposal of radioactive wastes as part of the project titled ''Final disposal of radioactive wastes as seen from the viewpoint of ethical objectives''. The questions arising from the opinions, positions and publications presented in the report by GRS were to serve as a basis for an expert discussion or an interdisciplinary discussion forum for all concerned with the ethical aspects of an answerable approach to the final disposal of radioactive wastes. In April 2001 GRS held a one-day seminar at which leading ethicists and philosophers offered statements on the questions referred to above and joined in a discussion with experts on issues of final disposal. This report documents the questions that arose ahead of the workshop, the specialist lectures held there and a summary of the discussion results [de

  19. Availability Perception And Constraints Of Final Year Students To The Use Of Computer-Based Information Communication Technologies (CB-ICTs).

    Directory of Open Access Journals (Sweden)

    Nto

    2015-08-01

    Full Text Available There is no doubt that ICTs are the major focus for the day-to-day running of every society. ICTs create room for quicker, easier access to and exchange of information in the world today. The study investigated the availability, attitude and constraints of final year students in the use of computer-based ICTs (CB-ICTs) in Abia State. Data were collected with a well-structured questionnaire and analysed with descriptive statistics. The analysis revealed that the mean age of the respondents was 23 years; 7 CB-ICTs were available to the respondents to varying degrees. The respondents had a positive attitude (x̄ = 3.11) to the use of CB-ICTs, and the major constraint to their use of CB-ICTs was the poor resource centre where they can access CB-ICTs. Based on the findings, we recommend that resource centres should be built in the institution and, if they already exist, should be well equipped and running. Equally, awareness that there is or will be a resource centre in the institution should be widely spread, so that students can utilize the opportunity of its existence and make maximum use of the facilities therein. Internet service providers should also scale up their services in the area so as to provide a more stable internet connection in the institution, as this will enable the students to use CB-ICTs more effectively for their academic work.

  20. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system
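
    The record-keeping described above (initial resistance values plus periodic re-checks) can be pictured with a minimal sketch that flags sensors whose measured resistance has drifted from its stored baseline. The sensor names, readings and 5% tolerance below are hypothetical and are not taken from the MFTF software.

    ```python
    # Hedged sketch: flag sensors whose resistance has drifted from its baseline.
    # Baseline values, readings and the 5% tolerance are invented for illustration.

    baseline_ohms = {"TC-01": 108.2, "SG-07": 350.5, "LHe-level-2": 55.1}
    latest_ohms   = {"TC-01": 108.5, "SG-07": 371.0, "LHe-level-2": 55.0}

    def check_sensors(baseline, latest, tolerance=0.05):
        """Return a list of (sensor, relative drift) pairs exceeding the tolerance."""
        suspect = []
        for sensor, ref in baseline.items():
            reading = latest.get(sensor)
            if reading is None:
                suspect.append((sensor, None))        # no reading at all
                continue
            drift = abs(reading - ref) / ref
            if drift > tolerance:
                suspect.append((sensor, drift))
        return suspect

    print(check_sensors(baseline_ohms, latest_ohms))  # [('SG-07', 0.0585...)]
    ```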

  1. USAGE OF STANDARD PERSONAL COMPUTER PORTS FOR DESIGNING OF THE DOUBLE REDUNDANT FAULT-TOLERANT COMPUTER CONTROL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rafig SAMEDOV

    2005-01-01

    Full Text Available In this study, the ports of standard personal computers have been investigated for the design of fault-tolerant control systems, different structure versions have been designed, and a method for choosing an optimal structure has been suggested. Within this scope, first of all, the ÇİFTYAK system has been defined and its working principle determined. Then, the data transmission ports of standard personal computers have been classified and analyzed. After that, the structure versions have been designed and evaluated according to the data transmission methods used, the number of ports, and the criteria of reliability, performance, truth, control and cost. Finally, a method for choosing the optimal structure version has been suggested.
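
    One generic way to rank candidate structure versions against criteria such as reliability, performance and cost is a weighted score; the sketch below shows this under invented scores and weights, and is not necessarily the selection method the authors suggest.

    ```python
    # Hedged sketch: weighted scoring of candidate structures. Scores are on a
    # 0-10 scale where higher is better (cheaper cost -> higher score).
    # All names, numbers and weights are hypothetical.

    criteria = ["reliability", "performance", "truth", "control", "cost"]
    weights  = {"reliability": 0.35, "performance": 0.25, "truth": 0.15,
                "control": 0.15, "cost": 0.10}

    versions = {
        "serial-port pair":   {"reliability": 7, "performance": 5, "truth": 8, "control": 7, "cost": 9},
        "parallel-port pair": {"reliability": 8, "performance": 7, "truth": 8, "control": 6, "cost": 7},
        "mixed ports":        {"reliability": 9, "performance": 6, "truth": 7, "control": 8, "cost": 5},
    }

    def score(version_scores):
        """Weighted sum of the criterion scores for one candidate structure."""
        return sum(weights[c] * version_scores[c] for c in criteria)

    best = max(versions, key=lambda name: score(versions[name]))
    for name, s in versions.items():
        print(f"{name:20s} -> {score(s):.2f}")
    print("best:", best)
    ```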

  2. Computational Toxicology as Implemented by the US EPA ...

    Science.gov (United States)

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T

  3. Abstraction ability as an indicator of success for learning computing science?

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael Edelgaard

    2008-01-01

    Computing scientists generally agree that abstract thinking is a crucial component for practicing computer science. We report on a three-year longitudinal study to confirm the hypothesis that general abstraction ability has a positive impact on performance in computing science. Abstraction ability...... is operationalized as stages of cognitive development for which validated tests exist. Performance in computing science is operationalized as grade in the final assessment of ten courses of a bachelor's degree programme in computing science. The validity of the operationalizations is discussed. We have investigated...... the positive impact overall, for two groupings of courses (a content-based grouping and a grouping based on SOLO levels of the courses' intended learning outcome), and for each individual course. Surprisingly, our study shows that there is hardly any correlation between stage of cognitive development...

  4. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
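
    The all-pairs distance computation used as the article's running example can be written compactly with array broadcasting. The sketch below is plain NumPy on the CPU; as an assumption for illustration (not something the article states), the same array code can typically be run on a GPU by swapping numpy for the drop-in cupy module.

    ```python
    # Minimal sketch of the all-pairs Euclidean distance between instances in a
    # dataset. Runs on the CPU with NumPy; it is not the CUDA kernel described
    # in the article, just the same computation in array form.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.random((1000, 32))            # 1000 instances, 32 features (synthetic)

    # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 x_i . x_j
    sq = np.sum(X * X, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)           # clamp tiny negatives from round-off
    D = np.sqrt(d2)

    print(D.shape, D[0, 1])               # (1000, 1000) distance matrix
    ```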

  5. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses from existing drugs and holds the great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data of small molecular compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we show recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  6. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    Science.gov (United States)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  7. SCEAPI: A unified Restful Web API for High-Performance Computing

    Science.gov (United States)

    Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi

    2017-10-01

    The development of scientific computing is increasingly moving to collaborative web and mobile applications. All these applications need high-quality programming interface for accessing heterogeneous computing resources consisting of clusters, grid computing or cloud computing. In this paper, we introduce our high-performance computing environment that integrates computing resources from 16 HPC centers across China. Then we present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources with HTTP or HTTPs protocols. We discuss SCEAPI from several aspects including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI including authentication, file transfer and job management for creating, submitting and monitoring, and how to use SCEAPI in an easy-to-use way. Finally, we discuss how to exploit more HPC resources quickly for the ATLAS experiment by implementing the custom ARC compute element based on SCEAPI, and our work shows that SCEAPI is an easy-to-use and effective solution to extend opportunistic HPC resources.
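
    As a rough illustration of how a RESTful HPC API of this kind is usually driven from a client, the sketch below posts a job description over HTTPS with the requests library and then polls its status. The base URL, endpoint paths, token header and JSON fields are hypothetical placeholders, not the documented SCEAPI interface.

    ```python
    # Hedged sketch of a client for a RESTful HPC web API. Every URL, path and
    # field name here is a hypothetical placeholder, not the real SCEAPI schema.
    import requests

    BASE = "https://sceapi.example.org/api/v1"       # placeholder base URL
    TOKEN = "REPLACE_WITH_REAL_TOKEN"                # obtained via the API's auth step
    headers = {"Authorization": f"Bearer {TOKEN}"}

    # 1. Submit a job described as JSON (fields are illustrative only).
    job = {"application": "atlas-sim", "cores": 64, "walltime_minutes": 120,
           "input_files": ["input.tar.gz"]}
    resp = requests.post(f"{BASE}/jobs", json=job, headers=headers, timeout=30)
    resp.raise_for_status()
    job_id = resp.json()["id"]

    # 2. Poll the job status until it leaves the queue (single poll shown).
    status = requests.get(f"{BASE}/jobs/{job_id}", headers=headers, timeout=30).json()
    print(job_id, status.get("state"))
    ```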

  8. Computing on Encrypted Data: Theory and Application

    Science.gov (United States)

    2016-01-01

    ...permits short ciphertexts (e.g., encrypted using AES) to be decompressed to longer ciphertexts that permit homomorphic operations. Bootstrapping ... allows us to save memory by storing data encrypted in the compressed form (e.g., under AES). Here, we revisit bootstrapping, viewing it as an ... Computing on Encrypted Data: Theory and Application, Massachusetts Institute of Technology, January 2016, final technical report.

  9. Computed tomography of the head in neurological examination of children

    International Nuclear Information System (INIS)

    Baeckman, E.; Egg-Olofsson, O.; Raadberg, C.

    1980-01-01

    A total of 247 children from the departments of pediatrics and neurosurgery were examined with computed tomography of the head during a two year period in 1977-78. Pathological changes were demonstrated in 79 per cent. Supplementary neuro-radiological examination - angiography and encephalography - was necessary in 17 per cent. Computed tomography together with the clinical assessment frequently suffices for final diagnosis. Computed tomography greatly reduces the need for previously used neurological examinations including skull radiography. Complications may ensue because of over-sensitivity to intravenously administered contrast medium in connection with anesthesia, and the radiation dose particularly to the crystalline lens of the eye must be taken into account. Computed tomography should therefore be used only on strict indications after careful scrutiny of the case history and the status. (author)

  10. Development of an electrical impedance computed tomographic two-phase flows analyzer. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Ovacik, L.; Jones, O.C.

    1998-08-01

    This report summarizes the work on the research project on this cooperative program between DOE and Hitachi, Ltd. Major advances were made in the computational reconstruction of images from electrical excitation and response data with respect to existing capabilities reported in the literature. A demonstration is provided of the imaging of one or more circular objects within the measurement plane with demonstrated linear resolution of six parts in two hundred. At this point it can be said that accurate excitation and measurement of boundary voltages and currents appears adequate to obtain reasonable images of the real conductivity distribution within a body and the outlines of insulating targets suspended within a homogeneous conducting medium. The quality of images is heavily dependent on the theoretical and numerical implementation of imaging algorithms. The overall imaging system described has the potential of being both fast and cost effective in comparison with alternative methods. The methods developed use multiple plate-electrode excitation in conjunction with finite element block decomposition, preconditioned voltage conversion, layer approximation of the third dimension and post processing of boundary measurements to obtain optimal boundary excitations. Reasonably accurate imaging of single and multiple targets of differing size, location and separation is demonstrated and the resulting images are better than any others found in the literature. Recommendations for future effort include the improvement in computational algorithms with emphasis on internal conductivity shape functions and the use of adaptive development of quadrilateral (2-D) or tetrahedral or hexahedral (3-D) elements to coincide with large discrete zone boundaries in the fields, development of a truly binary model and completion of a fast imaging system. Further, the rudimentary methods shown herein for three-dimensional imaging need improving.

  11. A Novel Cloud Computing Algorithm of Security and Privacy

    Directory of Open Access Journals (Sweden)

    Chih-Yung Chen

    2013-01-01

    Full Text Available The emergence of cloud computing has simplified large-scale deployment of distributed systems for software suppliers; when the same application programs are issued through a shared cloud service to different users, the management of material becomes more complex. In a multi-type cloud service trust environment, what most worries enterprises facing cloud computing is the issue of security, while individual users worry about the risk of their private material leaking out. This research analyzes several different construction patterns of cloud computing, together with relevant cases of sound and unsound security in cloud deployments, and finally proposes an optimized secure deployment of cloud computing and a security mechanism for material protection, namely the Global Authentication Register System (GARS), to reduce the risk of cloud material outflow. We implemented a system simulation to test the availability, security and performance of the GARS algorithm. Analysis of the experimental data shows that the cloud computing security and privacy solutions derived from the research can provide effective protection of cloud information security. Moreover, we offer information-security proposals for cloud computing that would assist related units in developing cloud computing security practice.

  12. The first stage of BFS integrated system for nuclear materials control and accounting. Final report

    International Nuclear Information System (INIS)

    1996-09-01

    The BFS computerized accounting system is a network-based one. It runs in a client/server mode. The equipment used in the system includes a computer network consisting of: One server computer system, including peripheral hardware and three client computer systems. The server is located near the control room of the BFS-2 facility outside of the 'stone sack' to ensure access during operation of the critical assemblies. Two of the client computer systems are located near the assembly tables of the BFS-1 and BFS-2 facilities while the third one being the Fissile Material Storage. This final report details the following topics: Computerized nuclear material accounting methods; The portal monitoring system; Test and evaluation of item control technology; Test and evaluation of radiation based nuclear material measurement equipment; and The integrated demonstration of nuclear material control and accounting methods

  13. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    crypto without control), Internet abuse (antisocial use of data communications), and international industrial espionage (governments stealing business secrets). A wide variety of safeguards are necessary to deal with these new crimes. The most powerful controls include (1) carefully controlled use of cryptography and digital signatures with good key management and overriding business and government decryption capability and (2) use of tokens such as smart cards to increase the strength of secret passwords for authentication of computer users. Jewelry-type security for small computers--including registration of serial numbers and security inventorying of equipment, software, and connectivity--will be necessary. Other safeguards include automatic monitoring of computer use and detection of unusual activities, segmentation and filtering of networks, special paper and ink for documents, and reduction of paper documents. Finally, international cooperation of governments to create trusted environments for business is essential.

  14. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real -time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from
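
    The simple two-dimensional domain decomposition mentioned above, assigning rectangular regions of the image to different processors so that communication grows only with the block's linear dimension, can be pictured in a few lines. The image size and processor grid below are arbitrary illustrative choices, not the thesis's configuration.

    ```python
    # Hedged sketch: split an H x W image into a Pr x Pc grid of rectangular
    # blocks and record which block each processor owns. Sizes are illustrative.
    import numpy as np

    H, W = 512, 512          # image size (synthetic)
    Pr, Pc = 4, 4            # 16 processors arranged as a 4 x 4 grid

    def block_bounds(n, parts, idx):
        """Half-open [start, stop) range of block `idx` when n points are split into `parts`."""
        edges = np.linspace(0, n, parts + 1).astype(int)
        return edges[idx], edges[idx + 1]

    assignment = {}
    for rank in range(Pr * Pc):
        br, bc = divmod(rank, Pc)
        r0, r1 = block_bounds(H, Pr, br)
        c0, c1 = block_bounds(W, Pc, bc)
        assignment[rank] = (slice(r0, r1), slice(c0, c1))

    # Each rank would work on image[assignment[rank]]; the halo it exchanges
    # with neighbours grows only with the block's perimeter.
    print(assignment[0], assignment[5])
    ```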

  15. Basic concepts in computational physics

    CERN Document Server

    Stickler, Benjamin A

    2016-01-01

    This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: Deterministic methods and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration, as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte-Carlo (MC) methods. Specific emphasis is on MARKOV chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the read...
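
    Of the stochastic methods the book covers, a Markov chain Monte Carlo sampler is the easiest to show briefly. Below is a generic random-walk Metropolis sketch for a one-dimensional standard normal target, written as an illustration of the technique rather than as code from the book; the target, step size and chain length are arbitrary.

    ```python
    # Minimal random-walk Metropolis sampler for a standard normal target.
    # Purely illustrative; the target, step size and chain length are arbitrary.
    import math
    import random

    def log_target(x):
        return -0.5 * x * x          # log of exp(-x^2/2), up to a constant

    def metropolis(n_steps=50_000, step=1.0, x0=0.0, seed=1):
        random.seed(seed)
        x, samples = x0, []
        for _ in range(n_steps):
            proposal = x + random.uniform(-step, step)
            # Accept with probability min(1, target(proposal)/target(x)).
            if math.log(random.random()) < log_target(proposal) - log_target(x):
                x = proposal
            samples.append(x)
        return samples

    chain = metropolis()
    mean = sum(chain) / len(chain)
    var = sum((s - mean) ** 2 for s in chain) / len(chain)
    print(round(mean, 2), round(var, 2))   # should be near 0 and 1
    ```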

  16. Open Compute Project at CERN

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Open Compute Project, OCP ( http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at lowest possible cost. The technologies are released as open hardware design, with the goal to develop servers and data centers following the model traditionally associated with open source software projects. We have been following the OCP project for some time and decided to buy two OCP twin servers in 2013 to get some hands-on experience. The servers have been tested and compared with our standard hardware regularly acquired through large tenders. In this presentation we will give some relevant results from this testing and also discuss some of the more important differences that can matter for a larger deployment at CERN. Finally it will outline the details for a possible project for a larger deployment of OCP hardware for production use at CERN.

  17. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  18. Limits on efficient computation in the physical world

    Science.gov (United States)

    Aaronson, Scott Joel

    More than a speculative technology, quantum computing seems to challenge our most basic intuitions about how the physical world should behave. In this thesis I show that, while some intuitions from classical computer science must be jettisoned in the light of modern physics, many others emerge nearly unscathed; and I use powerful tools from computational complexity theory to help determine which are which. In the first part of the thesis, I attack the common belief that quantum computing resembles classical exponential parallelism, by showing that quantum computers would face serious limitations on a wider range of problems than was previously known. In particular, any quantum algorithm that solves the collision problem---that of deciding whether a sequence of n integers is one-to-one or two-to-one---must query the sequence Ω(n^(1/5)) times. This resolves a question that was open for years; previously no lower bound better than constant was known. A corollary is that there is no "black-box" quantum algorithm to break cryptographic hash functions or solve the Graph Isomorphism problem in polynomial time. I also show that relative to an oracle, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform "quantum advice states"; and that any quantum algorithm needs Ω(2^(n/4)/n) queries to find a local minimum of a black-box function on the n-dimensional hypercube. Surprisingly, the latter result also leads to new classical lower bounds for the local search problem. Finally, I give new lower bounds on quantum one-way communication complexity, and on the quantum query complexity of total Boolean functions and recursive Fourier sampling. The second part of the thesis studies the relationship of the quantum computing model to physical reality. I first examine the arguments of Leonid Levin, Stephen Wolfram, and others who believe quantum computing to be fundamentally impossible. I find their arguments unconvincing without a "Sure

  19. Brain-Computer Interfaces in Medicine

    Science.gov (United States)

    Shih, Jerry J.; Krusienski, Dean J.; Wolpaw, Jonathan R.

    2012-01-01

    Brain-computer interfaces (BCIs) acquire brain signals, analyze them, and translate them into commands that are relayed to output devices that carry out desired actions. BCIs do not use normal neuromuscular output pathways. The main goal of BCI is to replace or restore useful function to people disabled by neuromuscular disorders such as amyotrophic lateral sclerosis, cerebral palsy, stroke, or spinal cord injury. From initial demonstrations of electroencephalography-based spelling and single-neuron-based device control, researchers have gone on to use electroencephalographic, intracortical, electrocorticographic, and other brain signals for increasingly complex control of cursors, robotic arms, prostheses, wheelchairs, and other devices. Brain-computer interfaces may also prove useful for rehabilitation after stroke and for other disorders. In the future, they might augment the performance of surgeons or other medical professionals. Brain-computer interface technology is the focus of a rapidly growing research and development enterprise that is greatly exciting scientists, engineers, clinicians, and the public in general. Its future achievements will depend on advances in 3 crucial areas. Brain-computer interfaces need signal-acquisition hardware that is convenient, portable, safe, and able to function in all environments. Brain-computer interface systems need to be validated in long-term studies of real-world use by people with severe disabilities, and effective and viable models for their widespread dissemination must be implemented. Finally, the day-to-day and moment-to-moment reliability of BCI performance must be improved so that it approaches the reliability of natural muscle-based function. PMID:22325364

  20. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  1. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having access to a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' the access to high-performance computing, giving flexibility to funding bodies for allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related. Usually uncertainty can be reduced with the availability of computational resources to better reproduce a phenomenon or to perform a larger number of experiments. Here we expose results of the application of cloud computing resources for climate modeling using cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The monetary cost associated is also analyzed. Finally we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  2. Generalized Bell-inequality experiments and computation

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD (United Kingdom); Wallman, Joel J. [School of Physics, The University of Sydney, Sydney, New South Wales 2006 (Australia); Browne, Dan E. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2011-12-15

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.
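
    The gap between local hidden variable correlations and the Popescu-Rohrlich box is easy to make concrete in the simplest two-party, two-setting, two-outcome case. The sketch below evaluates the CHSH expression for every local deterministic strategy and for the PR box, recovering the classical bound of 2 and the PR value of 4; this is a standard textbook calculation, not code from the paper.

    ```python
    # Hedged sketch: CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for
    # (a) all local deterministic strategies and (b) the Popescu-Rohrlich box.
    from itertools import product

    def chsh_local():
        """Maximum CHSH value over deterministic local strategies a(x), b(y) in {+1,-1}."""
        best = -4
        for a0, a1, b0, b1 in product([-1, 1], repeat=4):
            S = a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
            best = max(best, S)
        return best

    def chsh_pr_box():
        """CHSH value of the PR box, whose correlators are E(x,y) = +1 except E(1,1) = -1."""
        E = {(0, 0): 1.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): -1.0}
        return E[0, 0] + E[0, 1] + E[1, 0] - E[1, 1]

    print(chsh_local(), chsh_pr_box())   # 2 and 4.0
    ```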

  3. Generalized Bell-inequality experiments and computation

    International Nuclear Information System (INIS)

    Hoban, Matty J.; Wallman, Joel J.; Browne, Dan E.

    2011-01-01

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.

  4. The Guide to Better Hospital Computer Decisions

    Science.gov (United States)

    Dorenfest, Sheldon I.

    1981-01-01

    A soon-to-be-published major study of hospital computer use entitled “The Guide to Better Hospital Computer Decisions” was conducted by my firm over the past 2½ years. The study required over twenty (20) man years of effort at a cost of over $300,000, and the six (6) volume final report provides more than 1,000 pages of data about how hospitals are and will be using computerized medical and business information systems. It describes the current status and future expectations for computer use in major application areas, such as, but not limited to, finance, admitting, pharmacy, laboratory, data collection and hospital or medical information systems. It also includes profiles of over 100 companies and other types of organizations providing data processing products and services to hospitals. In this paper, we discuss the need for the study, the specific objectives of the study, the methodology and approach taken to complete the study and a few major conclusions.

  5. Computer simulation games in population and education.

    Science.gov (United States)

    Moreland, R S

    1988-01-01

    Computer-based simulation games are effective training tools that have several advantages. They enable players to learn in a nonthreatening manner and develop strategies to achieve goals in a dynamic environment. They also provide visual feedback on the effects of players' decisions, encourage players to explore and experiment with options before making final decisions, and develop players' skills in analysis, decision making, and cooperation. 2 games have been developed by the Research Triangle Institute for public-sector planning agencies interested in or dealing with developing countries. The UN Population and Development Game teaches players about the interaction between population variables and the national economy and how population policies complement other national policies, such as education. The BRIDGES Education Planning Game focuses on the effects education has on national policies. In both games, the computer simulates the reactions of a fictional country's socioeconomic system to players' decisions. Players can change decisions after seeing their effects on a computer screen and thus can improve their performance in achieving goals.

  6. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures require high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  7. Latent-failure risk estimates for computer control

    Science.gov (United States)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.
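
    A standard way to quantify the point about annual checks is the average unavailability of a periodically tested circuit with constant failure rate lambda and test interval T, which is approximately lambda*T/2 and therefore does not vanish merely because testing is performed once a year. The sketch below computes this textbook quantity; the failure rate and intervals are illustrative numbers, not values from the paper.

    ```python
    # Hedged sketch: mean fractional dead time (average unavailability) of a
    # safety circuit whose latent failures are only revealed by periodic tests.
    # For constant failure rate lam and test interval T (lam*T << 1), the
    # textbook approximation is U ~ lam * T / 2. Numbers are illustrative.
    import math

    def average_unavailability(lam_per_hour, test_interval_hours):
        """Exact time-average of 1 - exp(-lam*t) over one test interval."""
        lam, T = lam_per_hour, test_interval_hours
        return 1.0 - (1.0 - math.exp(-lam * T)) / (lam * T)

    lam = 1.0e-5                       # failures per hour (hypothetical)
    for label, T in [("monthly test", 730.0), ("annual test", 8760.0)]:
        exact = average_unavailability(lam, T)
        approx = lam * T / 2.0
        print(f"{label:12s}  exact={exact:.4f}  lam*T/2={approx:.4f}")
    ```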

  8. Blind quantum computation protocol in which Alice only makes measurements

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-05-01

    Blind quantum computation is a new secure quantum computing protocol which enables Alice (who does not have sufficient quantum technology) to delegate her quantum computation to Bob (who has a full-fledged quantum computer) in such a way that Bob cannot learn anything about Alice's input, output, and algorithm. In previous protocols, Alice needs to have a device which generates quantum states, such as single-photon states. Here we propose another type of blind computing protocol where Alice does only measurements, such as the polarization measurements with a threshold detector. In several experimental setups, such as optical systems, the measurement of a state is much easier than the generation of a single-qubit state. Therefore our protocols ease Alice's burden. Furthermore, the security of our protocol is based on the no-signaling principle, which is more fundamental than quantum physics. Finally, our protocols are device independent in the sense that Alice does not need to trust her measurement device in order to guarantee the security.

  9. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.

  10. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  11. FY05 LDRD Final Report A Computational Design Tool for Microdevices and Components in Pathogen Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Trebotich, D

    2006-02-07

    We have developed new algorithms to model complex biological flows in integrated biodetection microdevice components. The proposed work is important because the design strategy for the next-generation Autonomous Pathogen Detection System at LLNL is the microfluidic-based Biobriefcase, being developed under the Chemical and Biological Countermeasures Program in the Homeland Security Organization. This miniaturization strategy introduces a new flow regime to systems where biological flow is already complex and not well understood. Also, design and fabrication of MEMS devices is time-consuming and costly due to the current trial-and-error approach. Furthermore, existing devices, in general, are not optimized. There are several MEMS CAD capabilities currently available, but their computational fluid dynamics modeling capabilities are rudimentary at best. Therefore, we proposed a collaboration to develop computational tools at LLNL which will (1) provide critical understanding of the fundamental flow physics involved in bioMEMS devices, (2) shorten the design and fabrication process, and thus reduce costs, (3) optimize current prototypes and (4) provide a prediction capability for the design of new, more advanced microfluidic systems. Computational expertise was provided by Comp-CASC and UC Davis-DAS. The simulation work was supported by key experiments for guidance and validation at UC Berkeley-BioE.

  12. Quantum computers in phase space

    International Nuclear Information System (INIS)

    Miquel, Cesar; Paz, Juan Pablo; Saraceno, Marcos

    2002-01-01

    We represent both the states and the evolution of a quantum computer in phase space using the discrete Wigner function. We study properties of the phase space representation of quantum algorithms: apart from analyzing important examples, such as the Fourier transform and Grover's search, we examine the conditions for the existence of a direct correspondence between quantum and classical evolutions in phase space. Finally, we describe how to measure directly the Wigner function in a given phase-space point by means of a tomographic method that, itself, can be interpreted as a simple quantum algorithm
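
    To give a feel for the discrete Wigner representation, the sketch below evaluates, for a single qubit, a common Wootters-style phase-point construction A(q,p) = (I + (-1)^q Z + (-1)^p X + (-1)^(q+p) Y)/2 with W(q,p) = Tr(rho A(q,p))/2. This is a standard textbook construction used purely as an illustration and is not necessarily the formulation used in the paper.

    ```python
    # Hedged sketch: discrete Wigner function of a single qubit on a 2x2 phase
    # space, using one common (Wootters-style) phase-point operator convention.
    # Illustrative only; not necessarily the convention used in the paper.
    import numpy as np

    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def phase_point(q, p):
        return 0.5 * (I + (-1)**q * Z + (-1)**p * X + (-1)**(q + p) * Y)

    def discrete_wigner(rho):
        return np.array([[np.real(np.trace(rho @ phase_point(q, p))) / 2
                          for p in (0, 1)] for q in (0, 1)])

    rho0 = np.array([[1, 0], [0, 0]], dtype=complex)           # |0><0|
    plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # |+><+|

    print(discrete_wigner(rho0))   # row sums give the Z-basis probabilities
    print(discrete_wigner(plus))   # column sums give the X-basis probabilities
    ```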

  13. The Application of Visual Basic Computer Programming Language to Simulate Numerical Iterations

    Directory of Open Access Journals (Sweden)

    Abdulkadir Baba HASSAN

    2006-06-01

    Full Text Available This paper examines the application of the Visual Basic computer programming language to simulate numerical iterations, the merit of Visual Basic as a programming language, and the difficulties faced when solving numerical iterations analytically. The paper encourages the use of computer programming methods for the execution of numerical iterations and finally fashions out and develops a reliable solution, using the Visual Basic package to write a program for some selected iteration problems.
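
    As an illustration of the kind of numerical iteration such a program executes (shown here in Python rather than the paper's Visual Basic, purely as a sketch), the snippet below runs a Newton iteration for a root of f(x) = x^2 - 2 and prints each iterate; the function and starting point are arbitrary examples.

    ```python
    # Hedged sketch of a simple numerical iteration (Newton's method), shown in
    # Python rather than the paper's Visual Basic; function and start are examples.

    def newton(f, df, x0, tol=1e-10, max_iter=50):
        """Iterate x <- x - f(x)/df(x) until |f(x)| < tol."""
        x = x0
        for k in range(max_iter):
            fx = f(x)
            print(f"iteration {k}: x = {x:.12f}, f(x) = {fx:.3e}")
            if abs(fx) < tol:
                return x
            x = x - fx / df(x)
        raise RuntimeError("did not converge")

    root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
    print("sqrt(2) ~", root)
    ```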

  14. Proceedings of the Third International Conference on Trends in Information, Telecommunication and Computing

    CERN Document Server

    2013-01-01

    The Proceedings of the Third International Conference on Trends in Information, Telecommunication and Computing provides in-depth understanding of the fundamental challenges in the fields of Computational Engineering, Computer, Power Electronics, Instrumentation, Control System, and Telecommunication Technology. This book provides a broad vision for the future of research in these fields, with ideas on how to support these new technologies in current practice. Every submitted paper received a careful review from the committee and the final accept/reject decisions were made by the co-chairs on the basis of recommendations from the committee members.

  15. State-of-the-art technology for an extended computing centre

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    On 7 May, CERN’s Director-General, Rolf Heuer, the Director for Research and Computing, Sergio Bertolucci, the EN Department Head, Roberto Saban, and several guests joined the IT Department Head, Frédéric Hemmer, for the inauguration of the new facilities at the CERN Computing Centre.   (Photo caption: One of the new ventilation units and a big duct, installed as part of the Computing Centre consolidation project.) After nearly two years of work, the IT Department now boasts a new computer room, equipped with its own cooling system to house the Computing Centre’s critical IT systems, which can, from now on, be decoupled from the other systems in the building. New electrical facilities have been added too, boosting the Centre’s computing power from 2.9 to 3.5 MW. Finally, an additional 40 cubic-metre water tank has been installed to allow continued cooling of the IT systems in the event of a major incident. But the star attraction of the extension project has ...

  16. Using Infrastructure Awareness to Support the Recruitment of Volunteer Computing Participants

    DEFF Research Database (Denmark)

    Ramos, Juan David Hincapie

    , the properties of computational infrastructures provided in the periphery of the user’s attention, and supporting gradual disclosure of detailed information on user’s request. Working with users of the Mini-Grid, this thesis shows the design process of two infrastructure awareness systems aimed at supporting...... the recruitment of participants, the implementation of one possible technical strategy, and an in-the-wild evaluation. The thesis finalizes with a discussion of the results and implications of infrastructure awareness for participative and other computational infrastructures....

  17. The Final Merger of Massive Black Holes: Recoils, Gravitational Waves, and Electromagnetic Signatures

    Science.gov (United States)

    Centrella, Joan M.

    2010-01-01

    The final merger of two massive black holes produces a powerful burst of gravitational radiation, emitting more energy than all the stars in the observable universe combined. The resulting gravitational waveforms will be easily detectable by the space-based LISA out to redshifts z greater than 10, revealing the masses and spins of the black holes to high precision. If the merging black holes have unequal masses, or asymmetric spins, the final black hole that forms can recoil with a velocity exceeding 1000 km/s. And, when the black holes merge in the presence of gas and magnetic fields, various types of electromagnetic signals may also be produced. For more than 30 years, scientists have tried to compute black hole mergers using the methods of numerical relativity. The resulting computer codes have been plagued by instabilities, causing them to crash well before the black holes in the binary could complete even a single orbit. Within the past few years, however, this situation has changed dramatically, with a series of remarkable breakthroughs. This talk will focus on new results that are revealing the dynamics and waveforms of binary black hole mergers, recoil velocities, and the possibility of accompanying electromagnetic outbursts.

  18. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with computer request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  19. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  20. XXVII IUPAP Conference on Computational Physics (CCP2015)

    International Nuclear Information System (INIS)

    Santra, Sitangshu Bikas; Ray, Purusattam

    2016-01-01

    ... and Soft Materials, Supercomputing and Computational Physics Teaching, and Computational Physics and Sustainable Energy. As organizers and editors of these Proceedings, we are very pleased with the number and the quality of the papers provided by the participants. The papers cover a good cross-section of what was presented at the meeting, and we are sure that they represent the state of computational physics today. The remainder of this Preface contains lists detailing the organizational structure of CCP2015, endorsers and sponsors of the meeting, plenary and invited talks, and a presentation of the 2015 IUPAP C20 Young Scientist Prize. Finally, we would like to express our sincere thanks to our sponsors and endorsers: the C20 Commission of the International Union of Pure and Applied Physics (IUPAP); the Division of Computational Physics, American Physical Society (APS); the European Physical Society (EPS); the Association of Asia Pacific Physical Societies (AAAPS); Intel Software; Nvidia; Fujitsu; Netweb Technologies; The Institute of Mathematical Sciences, Chennai; and finally the Indian Institute of Technology Guwahati. We are grateful to the organizing committee, the International Advisory Board, the local organizing committee and the participants who helped in making CCP2015 a success. (paper)

  1. Computation of Asteroid Proper Elements on the Grid

    Science.gov (United States)

    Novakovic, B.; Balaz, A.; Knezevic, Z.; Potocnik, M.

    2009-12-01

    A procedure for gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time-consuming computations and make them more efficient is justified by the large increase in observational data expected from the next-generation all-sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids have been derived since the Grid infrastructure began to be used for this purpose. The average time for the catalog updates is significantly shortened with respect to the time needed with stand-alone workstations. We also present the basics of Grid computing, the concepts of Grid middleware and its Workload Management System. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for future work.

  2. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through an analysis of existing pallet pooling information platforms (PPIP), the paper points out that existing studies of the PPIP are mainly based on traditional IT infrastructures and technologies, which have software, hardware, resource-utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gives a PPIP architecture based on cloud computing with two parts: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. Finally, a method for deploying the PPIP based on cloud computing is proposed.

  3. Computer problem-solving coaches for introductory physics: Design and usability studies

    Science.gov (United States)

    Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew

    2016-06-01

    The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.

  4. [Evaluation of production and clinical working time of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays for complete denture].

    Science.gov (United States)

    Wei, L; Chen, H; Zhou, Y S; Sun, Y C; Pan, S X

    2017-02-18

    To compare the technician fabrication time and clinical working time of custom trays fabricated using two different methods, the three-dimensional printed custom trays and the conventional custom trays, and to prove the feasibility of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays in clinical use from the perspective of clinical time cost. Twenty edentulous patients were recruited into this study, which was a prospective, single-blind, randomized self-controlled clinical trial. Two custom trays were fabricated for each participant. One of the custom trays was fabricated using the functional suitable denture (FSD) system through a CAD/CAM process, and the other was manually fabricated using conventional methods. Then the final impressions were taken using both custom trays, followed by utilizing the final impressions to fabricate complete dentures respectively. The technician production time of the custom trays and the clinical working time of taking the final impression were recorded. The average times spent on fabricating the three-dimensional printed custom trays using the FSD system and fabricating the conventional custom trays manually were (28.6±2.9) min and (31.1±5.7) min, respectively. The average times spent on making the final impression with the three-dimensional printed custom trays using the FSD system and the conventional custom trays fabricated manually were (23.4±11.5) min and (25.4±13.0) min, respectively. There was a significant difference in the technician fabrication time and the clinical working time between the three-dimensional printed custom trays using the FSD system and the conventional custom trays fabricated manually (P<0.05). To manufacture custom trays by the three-dimensional printing method, there is no need to pour a preliminary cast after taking the primary impression; therefore, it can save impression material and model material. As to complete denture restoration, manufacturing custom trays using the FSD system is worth being ...

  5. Computing Optical Variable Periods of BL Lac Object S5 0716+ 714 ...

    Indian Academy of Sciences (India)

    The study of long-term periodical variation is an important way to get the characteristics ... continuous Fourier transform together, define a window function, and finally obtain ...

  6. Improvement of level-1 PSA computer code package

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase-I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The improvement of level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, (2) development of an applications methodology of PSA techniques for the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP is converted to the PC-Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) is developed considering dependency among source data. A computer program which handles dependency among data sources is also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods are reviewed and a CCF database is established. Impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database, is also developed. A CCF analysis reflecting plant-specific defensive strategy against CCF events is also performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique is applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies are adopted and applied to the improvement of the reliability of the emergency diesel generators (EDG) of nuclear power plants. To help the RCA and RCM analyses, two software programs are developed, EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.

  7. Improvement of level-1 PSA computer code package

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This year is the fifth (final) year of phase-I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The improvement of level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, (2) development of an applications methodology of PSA techniques for the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP is converted to the PC-Windows environment. For the improvement of efficiency in performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling are developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) is developed considering dependency among source data. A computer program which handles dependency among data sources is also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods are reviewed and a CCF database is established. Impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, which handles the CCF database, is also developed. A CCF analysis reflecting plant-specific defensive strategy against CCF events is also performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique is applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies are adopted and applied to the improvement of the reliability of the emergency diesel generators (EDG) of nuclear power plants. To help the RCA and RCM analyses, two software programs are developed, EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.

  8. Neurons to algorithms LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Rothganger, Fredrick H.; Aimone, James Bradley; Warrender, Christina E.; Trumbo, Derek

    2013-09-01

    Over the last three years the Neurons to Algorithms (N2A) LDRD project teams has built infrastructure to discover computational structures in the brain. This consists of a modeling language, a tool that enables model development and simulation in that language, and initial connections with the Neuroinformatics community, a group working toward similar goals. The approach of N2A is to express large complex systems like the brain as populations of a discrete part types that have specific structural relationships with each other, along with internal and structural dynamics. Such an evolving mathematical system may be able to capture the essence of neural processing, and ultimately of thought itself. This final report is a cover for the actual products of the project: the N2A Language Specification, the N2A Application, and a journal paper summarizing our methods.

  9. Cross-Border Litigation and ADR Mechanisms in Disputes Concerning Mobile Computing in the EU

    DEFF Research Database (Denmark)

    Savin, Andrej

    2011-01-01

    The aim of this paper is to discuss briefly how the EU rules on jurisdiction, choice of law and alternative dispute resolution in civil and commercial matters operate in the context of mobile computing. The article first looks at rules on jurisdiction in commercial disputes, both between businesses and between businesses and consumers. It then discusses the choice-of-law issues applicable to mobile computing. Finally, there is an examination of alternative dispute resolution as an alternative to regular courts in transactions involving mobile computing.

  10. Research progress on quantum informatics and quantum computation

    Science.gov (United States)

    Zhao, Yusheng

    2018-03-01

    Quantum informatics is an emerging interdisciplinary subject developed from the combination of quantum mechanics, information science, and computer science in the 1980s. The birth and development of quantum information science has far-reaching significance for science and technology. At present, the application of quantum information technology has become the direction of much research effort. The preparation, storage, purification and regulation, transmission, and coding and decoding of quantum states have become hotspots for scientists and engineers, with a profound impact on the national economy, people’s livelihood, and defense technology. This paper first summarizes the background of quantum information science and quantum computers and the current state of domestic and foreign research, and then introduces the basic knowledge and concepts of quantum computing. Finally, several quantum algorithms are introduced in detail, including the quantum Fourier transform, the Deutsch-Jozsa algorithm, Shor’s algorithm, and quantum phase estimation.
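
    As a quick reference for the first of the algorithms listed above, the quantum Fourier transform on n qubits (N = 2^n basis states) is conventionally defined as follows; this is the standard textbook form, not anything specific to the record above.

        \[
          \mathrm{QFT}_N : \lvert j \rangle \;\longmapsto\; \frac{1}{\sqrt{N}} \sum_{k=0}^{N-1} e^{2\pi i jk/N} \, \lvert k \rangle , \qquad N = 2^{n}, \; j = 0, \dots, N-1 .
        \]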

  11. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating on computer science and education. In the proceedings, readers can learn much about the computer science and education research of authors from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in these fields. In order to meet the high quality standards of the Springer AISC series, the organizing committee has made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, periodic review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  12. EMI Evaluation on Wireless Computer Devices in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Jae Ki; JI Yeong Hwa; Sung, Chan Ho

    2011-01-01

    Wireless computer devices, for example mice and keyboards, are widely used in various industries. However, I and C (instrumentation and control) equipment in nuclear power plants is very susceptible to EMI (electromagnetic interference), and there are concerns regarding EMI-induced transients caused by wireless computer devices, which emit electromagnetic waves for communication. In this paper, industrial practices and nuclear-related international standards are investigated to verify the requirements for wireless devices. In addition, actual measurement and evaluation of the EMI intensity of some commercially available wireless devices are performed to verify their compatibility in terms of EMI. Finally, we suggest an appropriate method of using wireless computer devices in nuclear power plant control rooms for better office circumstances for operators.

  13. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.

  14. The quark gluon plasma: Lattice computations put to experimental test

    Indian Academy of Sciences (India)

    I describe how lattice computations are being used to extract experimentally relevant features of the quark gluon plasma. I deal specifically with relaxation times, photon emissivity, strangeness yields, event-by-event fluctuations of conserved quantities and hydrodynamic flow. Finally I give evidence that the plasma is rather ...

  15. Principles for the wise use of computers by children.

    Science.gov (United States)

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  16. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for multivariate probability distribution random generation, rather than as a function optimizer. Finally, some relevant applications of genetic algorithm to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, design of experiments.
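
    To make the canonical genetic algorithm surveyed above concrete, the following is a minimal, self-contained sketch in Python; the bit-string encoding, tournament selection, one-point crossover, and toy one-max fitness are illustrative assumptions, and a statistical application such as variable selection in regression would simply supply a model-scoring fitness function instead.

        import random

        def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=100,
                              crossover_rate=0.9, mutation_rate=0.01, seed=0):
            """Canonical bit-string GA loop; all parameter values are illustrative."""
            rng = random.Random(seed)
            # Random initial population of bit strings.
            pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
            best = max(pop, key=fitness)

            def select():
                # Binary tournament selection: keep the fitter of two random individuals.
                a, b = rng.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b

            for _ in range(generations):
                children = []
                while len(children) < pop_size:
                    p1, p2 = select(), select()
                    if rng.random() < crossover_rate:
                        cut = rng.randrange(1, n_bits)          # one-point crossover
                        c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                    else:
                        c1, c2 = p1[:], p2[:]
                    for child in (c1, c2):
                        for i in range(n_bits):                 # bit-flip mutation
                            if rng.random() < mutation_rate:
                                child[i] = 1 - child[i]
                        children.append(child)
                pop = children[:pop_size]
                best = max(pop + [best], key=fitness)           # keep the best-so-far
            return best

        # Toy "one-max" fitness (count of ones); a statistical use would instead score,
        # for example, a regression model built from the variables the bit string selects.
        print(genetic_algorithm(fitness=sum))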

  17. Experimental and Computational Investigation of Triple-rotating Blades in a Mower Deck

    Science.gov (United States)

    Chon, Woochong; Amano, Ryoichi S.

    Experimental and computational studies were performed on the 1.27m wide three-spindle lawn mower deck with side discharge arrangement. Laser Doppler Velocimetry was used to measure the air velocity at 12 different sections under the mower deck. The high-speed video camera test provided valuable visual evidence of airflow and grass discharge patterns. The strain gages were attached at several predetermined locations of the mower blades to measure the strain. In computational fluid dynamics work, computer based analytical studies were performed. During this phase of work, two different trials were attempted. First, two-dimensional blade shapes at several arbitrary radial sections were selected for flow computations around the blade model. Finally, a three-dimensional full deck model was developed and compared with the experimental results.

  18. Stability and Hopf bifurcation for a delayed SLBRS computer virus model.

    Science.gov (United States)

    Zhang, Zizhen; Yang, Huizhong

    2014-01-01

    By incorporating into the SLBRS model the time delay due to the period during which computers use antivirus software to clean the virus, a delayed SLBRS computer virus model is proposed in this paper. The dynamical behaviors, which include local stability and Hopf bifurcation, are investigated by regarding the delay as the bifurcation parameter. Specifically, the direction and stability of the Hopf bifurcation are derived by applying the normal form method and center manifold theory. Finally, an illustrative example is also presented to verify our analytical results.

  19. Stability and Hopf Bifurcation for a Delayed SLBRS Computer Virus Model

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2014-01-01

    Full Text Available By incorporating into the SLBRS model the time delay due to the period during which computers use antivirus software to clean the virus, a delayed SLBRS computer virus model is proposed in this paper. The dynamical behaviors, which include local stability and Hopf bifurcation, are investigated by regarding the delay as the bifurcation parameter. Specifically, the direction and stability of the Hopf bifurcation are derived by applying the normal form method and center manifold theory. Finally, an illustrative example is also presented to verify our analytical results.
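
    For readers who want to experiment with the kind of delayed model described in the two records above, the sketch below integrates a generic delayed SLBRS-type compartmental system with a simple Euler scheme; the right-hand sides, parameter values, and initial conditions are illustrative assumptions and not the authors' exact equations.

        import numpy as np

        def simulate_delayed_slbrs(tau=5.0, t_end=200.0, dt=0.01,
                                   mu=0.01, beta=0.3, alpha=0.2, gamma=0.1, eta=0.05):
            """Euler integration of a *generic* delayed SLBRS-type model.

            The equations and parameters are illustrative assumptions, not the cited
            paper's exact system; the delay tau stands for the time antivirus software
            needs to clean a breaking-out (actively infective) computer.
            """
            n = int(t_end / dt)
            lag = int(tau / dt)
            S, L, B, R = (np.empty(n + 1) for _ in range(4))
            S[0], L[0], B[0], R[0] = 0.7, 0.1, 0.1, 0.1      # initial fractions
            for k in range(n):
                B_lag = B[max(k - lag, 0)]                   # constant history before t = 0
                dS = mu - beta * S[k] * (L[k] + B[k]) - mu * S[k] + eta * R[k]
                dL = beta * S[k] * (L[k] + B[k]) - (alpha + mu) * L[k]
                dB = alpha * L[k] - gamma * B_lag - mu * B[k]
                dR = gamma * B_lag - (eta + mu) * R[k]
                S[k + 1] = S[k] + dt * dS
                L[k + 1] = L[k] + dt * dL
                B[k + 1] = B[k] + dt * dB
                R[k + 1] = R[k] + dt * dR
            return S, L, B, R

        if __name__ == "__main__":
            S, L, B, R = simulate_delayed_slbrs()
            print("final infective fraction (L + B):", L[-1] + B[-1])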

  20. THE PRODUCTION AND EVALUATION OF THREE COMPUTER-BASED ECONOMICS GAMES FOR THE SIXTH GRADE. FINAL REPORT.

    Science.gov (United States)

    WING, RICHARD L.; AND OTHERS

    THE PURPOSE OF THE EXPERIMENT WAS TO PRODUCE AND EVALUATE 3 COMPUTER-BASED ECONOMICS GAMES AS A METHOD OF INDIVIDUALIZING INSTRUCTION FOR GRADE 6 STUDENTS. 26 EXPERIMENTAL SUBJECTS PLAYED 2 ECONOMICS GAMES, WHILE A CONTROL GROUP RECEIVED CONVENTIONAL INSTRUCTION ON SIMILAR MATERIAL. IN THE SUMERIAN GAME, STUDENTS SEATED AT THE TYPEWRITER TERMINALS…

  1. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in the mobile multicloud computing environment, an MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (homomorphic verifiable response) with random masking and sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme mitigates the limited communication and computing power of mobile devices, supports dynamic data operations in the mobile multicloud environment, and allows data integrity to be verified without using the source file blocks directly. Experimental results also demonstrate that the scheme achieves a lower cost of computing and communications.
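
    As a concrete reference point for one ingredient of such schemes, the sketch below builds a plain binary Merkle hash tree over file blocks and checks an inclusion proof using Python's standard hashlib; it is a generic illustration, not the sequence-enforced sMHT or the BLS signature component of the record above. A verifier holding only the root can check a single block without the full file, which is the property such integrity schemes build on.

        import hashlib

        def _h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(leaves):
            """Root hash of a plain binary Merkle tree over the given data blocks."""
            level = [_h(b) for b in leaves]
            while len(level) > 1:
                if len(level) % 2:                 # duplicate last node on odd levels
                    level.append(level[-1])
                level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        def merkle_proof(leaves, index):
            """Sibling hashes needed to recompute the root from leaves[index]."""
            level = [_h(b) for b in leaves]
            proof = []
            while len(level) > 1:
                if len(level) % 2:
                    level.append(level[-1])
                sibling = index ^ 1
                proof.append((level[sibling], sibling < index))
                level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
                index //= 2
            return proof

        def verify(root, block, proof):
            node = _h(block)
            for sibling, sibling_is_left in proof:
                node = _h(sibling + node) if sibling_is_left else _h(node + sibling)
            return node == root

        if __name__ == "__main__":
            blocks = [f"file block {i}".encode() for i in range(5)]
            root = merkle_root(blocks)
            proof = merkle_proof(blocks, 3)
            print(verify(root, blocks[3], proof))          # True
            print(verify(root, b"tampered block", proof))  # False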

  2. Factors that Influence the Success of Male and Female Computer Programming Students in College

    Science.gov (United States)

    Clinkenbeard, Drew A.

    As the demand for a technologically skilled work force grows, experience and skill in computer science have become increasingly valuable for college students. However, the number of students graduating with computer science degrees is not growing proportional to this need. Traditionally several groups are underrepresented in this field, notably women and students of color. This study investigated elements of computer science education that influence academic achievement in beginning computer programming courses. The goal of the study was to identify elements that increase success in computer programming courses. A 38-item questionnaire was developed and administered during the Spring 2016 semester at California State University Fullerton (CSUF). CSUF is an urban public university comprised of about 40,000 students. Data were collected from three beginning programming classes offered at CSUF. In total 411 questionnaires were collected resulting in a response rate of 58.63%. Data for the study were grouped into three broad categories of variables. These included academic and background variables; affective variables; and peer, mentor, and role-model variables. A conceptual model was developed to investigate how these variables might predict final course grade. Data were analyzed using statistical techniques such as linear regression, factor analysis, and path analysis. Ultimately this study found that peer interactions, comfort with computers, computer self-efficacy, self-concept, and perception of achievement were the best predictors of final course grade. In addition, the analyses showed that male students exhibited higher levels of computer self-efficacy and self-concept compared to female students, even when they achieved comparable course grades. Implications and explanations of these findings are explored, and potential policy changes are offered.

  3. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    Science.gov (United States)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  4. Introduction of e-learning in dental radiology reveals significantly improved results in final examination.

    Science.gov (United States)

    Meckfessel, Sandra; Stühmer, Constantin; Bormann, Kai-Hendrik; Kupka, Thomas; Behrends, Marianne; Matthies, Herbert; Vaske, Bernhard; Stiesch, Meike; Gellrich, Nils-Claudius; Rücker, Martin

    2011-01-01

    Because a traditionally instructed dental radiology lecture course is very time-consuming and labour-intensive, online courseware, including an interactive-learning module, was implemented to support the lectures. The purpose of this study was to evaluate the perceptions of students who have worked with web-based courseware as well as the effect on their results in final examinations. Users (n3+4 = 138) had access to the e-program from any networked computer at any time. Two groups (n3 = 71, n4 = 67) had to pass a final exam after using the e-course. Results were compared with two groups (n1 = 42, n2 = 48) who had studied the same content by attending traditional lectures. In addition, a survey of the students was statistically evaluated. Most of the respondents reported a positive attitude towards e-learning and would have appreciated more access to computer-assisted instruction. Two years after initiating the e-course, the failure rate in the final examination dropped significantly, from 40% to less than 2%. The very positive response to the e-program and improved test scores demonstrated the effectiveness of our e-course as a learning aid. Interactive modules in step with clinical practice provided learning that is not achieved by traditional teaching methods alone. To what extent staff savings are possible is part of a further study. Copyright © 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  5. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lofti A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial
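
    To ground the parking example in something executable, here is a minimal triangular fuzzy membership function in Python; the fuzzy set "acceptably close to the curb" and its numeric breakpoints are purely illustrative assumptions.

        def triangular(x, a, b, c):
            """Triangular fuzzy membership function with support (a, c) and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Illustrative fuzzy set "acceptably close to the curb" for the parking example:
        # full membership around 20 cm, tolerating distances between roughly 0 and 50 cm.
        for distance_cm in (5, 20, 35, 60):
            print(distance_cm, triangular(distance_cm, 0, 20, 50))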

  6. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design secure data access for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  7. Narrative Finality

    Directory of Open Access Journals (Sweden)

    Armine Kotin Mortimer

    1981-01-01

    Full Text Available The clotural device of narration as salvation represents the lack of finality in three novels. In De Beauvoir's Tous les hommes sont mortels an immortal character turns his story to account, but the novel makes a mockery of the historical sense by which men define themselves. In the closing pages of Butor's La Modification, the hero plans to write a book to save himself. Through the thrice-considered portrayal of the Paris-Rome relationship, the ending shows the reader how to bring about closure, but this collective critique written by readers will always be a future book. Simon's La Bataille de Pharsale, the most radical attempt to destroy finality, is an infinite text. No new text can be written. This extreme of perversion guarantees bliss (jouissance). If the ending of De Beauvoir's novel transfers the burden of a non-final world onto a new victim, Butor's non-finality lies in the deferral to a future writing, while Simon's writer is stuck in a writing loop, in which writing has become its own end and hence can have no end. The deconstructive and tragic form of contemporary novels proclaims the loss of belief in a finality inherent in the written text, to the profit of writing itself.

  8. Development of the computer-aided process planning (CAPP) system for polymer injection molds manufacturing

    Directory of Open Access Journals (Sweden)

    J. Tepić

    2011-10-01

    Full Text Available The beginning of production and selling of polymer products largely depends on mold manufacturing. The costs of mold manufacturing have a significant share in the final price of a product. The best way to improve and rationalize the polymer injection mold production process is to automate mold design and manufacturing process planning. This paper reviews the development of a dedicated process planning system for manufacturing of molds for injection molding, which integrates computer-aided design (CAD), computer-aided process planning (CAPP) and computer-aided manufacturing (CAM) technologies.

  9. Computer Aided Design System for Developing Musical Fountain Programs

    Institute of Scientific and Technical Information of China (English)

    刘丹; 张乃尧; 朱汉城

    2003-01-01

    A computer aided design system for developing musical fountain programs was developed with multiple functions such as intelligent design, 3-D animation, manual modification and synchronized motion to make the development process more efficient. The system first analyzed the music form and sentiment using many basic features of the music to select a basic fountain program. Then, this program is simulated with 3-D animation and modified manually to achieve the desired results. Finally, the program is transformed to a computer control program to control the musical fountain in time with the music. A prototype system for the musical fountain was also developed. It was tested with many styles of music and users were quite satisfied with its performance. By integrating various functions, the proposed computer aided design system for developing musical fountain programs greatly simplified the design of the musical fountain programs.

  10. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  11. Substituting computers for services - potential to reduce ICT's environmental footprint

    Energy Technology Data Exchange (ETDEWEB)

    Plepys, A. [The International Inst. for Industrial Environmental Economics at Lund Univ. (Sweden)

    2004-07-01

    The environmental footprint of IT products is significant and, in spite of manufacturing and product design improvements, growing consumption of electronics results in increasing absolute environmental impact. Computers have a short technological lifespan and a lot of the built-in performance, although necessary, remains idle for most of the time. Today, most computers used in non-residential sectors are connected to networks. The premise of this paper is that computer networks are an untapped resource which could allow addressing the environmental impacts of IT products through centralising and sharing computing resources. The article presents results of a comparative study of two computing architectures. The first one is the traditional decentralised PC-based system and the second the centralised server-based computing (SBC) system. Both systems deliver equivalent functions to the final users and can therefore be compared on a one-to-one basis. The study evaluates product lifespan, energy consumption in the use stage, product design and its environmental implications in manufacturing. (orig.)

  12. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  13. Modeling the state dependent impulse control for computer virus propagation under media coverage

    Science.gov (United States)

    Liang, Xiyin; Pei, Yongzhen; Lv, Yunfei

    2018-02-01

    A state-dependent impulsive control model is proposed to model the spread of computer viruses incorporating media coverage. Using the successor function, sufficient conditions for the existence and uniqueness of an order-1 periodic solution are presented first. Secondly, for two classes of periodic solutions, the geometric property of the successor function and the analogue of the Poincaré criterion are employed to obtain stability results. These results show that the number of infective computers stays under the threshold at all times. Finally, theoretical and numerical analysis shows that media coverage can delay the spread of a computer virus.

  14. The Impact of Machine Translation and Computer-aided Translation on Translators

    Science.gov (United States)

    Peng, Hao

    2018-03-01

    Under the context of globalization, communications between countries and cultures are becoming increasingly frequent, which makes it imperative to use techniques that help with translation. This paper explores the influence of computer-aided translation on translators, drawing on the fields of computer-aided translation (CAT) and machine translation (MT). After an introduction to the development of machine and computer-aided translation, it describes the technologies available to translators, analyzes the demand for computer-aided translation in translation practice so far, considers how the design of computer-aided translation techniques can be optimized, and examines their operability in translation. The findings underline the advantages and disadvantages of MT and CAT tools, their serviceability, and the future development of MT and CAT technologies. Finally, this thesis probes into the impact of these new technologies on translators, in the hope that more translators and translation researchers can learn to use such tools to improve their productivity.

  15. On-Demand Final State Control of a Surface-Bound Bistable Single Molecule Switch.

    Science.gov (United States)

    Garrido Torres, José A; Simpson, Grant J; Adams, Christopher J; Früchtl, Herbert A; Schaub, Renald

    2018-04-12

    Modern electronic devices perform their defined action because of the complete reliability of their individual active components (transistors, switches, diodes, and so forth). For instance, to encode basic computer units (bits) an electrical switch can be used. The reliability of the switch ensures that the desired outcome (the component's final state, 0 or 1) can be selected with certainty. No practical data storage device would otherwise exist. This reliability criterion will necessarily need to hold true for future molecular electronics to have the opportunity to emerge as a viable miniaturization alternative to our current silicon-based technology. Molecular electronics target the use of single-molecules to perform the actions of individual electronic components. On-demand final state control over a bistable unimolecular component has therefore been one of the main challenges in the past decade (1-5) but has yet to be achieved. In this Letter, we demonstrate how control of the final state of a surface-supported bistable single molecule switch can be realized. On the basis of the observations and deductions presented here, we further suggest an alternative strategy to achieve final state control in unimolecular bistable switches.

  16. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Gurney, Kevin R. [Arizona Univ., Mesa, AZ (United States)

    2015-01-12

    This document constitutes the final report under DOE grant DE-FG-08ER64649. The organization of this document is as follows: first, I will review the original scope of the proposed research. Second, I will present the current draft of a paper nearing submission to Nature Climate Change on the initial results of this funded effort. Finally, I will present the last phase of the research under this grant, which has supported a Ph.D. student. To that end, I will present the graduate student’s proposed research, a portion of which is completed and reflected in the paper nearing submission. This final work phase will be completed in the next 12 months. This final work phase will likely result in 1-2 additional publications, and we consider the results (as exemplified by the current paper) high quality. The continuing results will acknowledge the funding provided by DOE grant DE-FG-08ER64649.

  17. Opportunities for Russian Nuclear Weapons Institute developing computer-aided design programs for pharmaceutical drug discovery. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-23

    The goal of this study is to determine whether physicists at the Russian Nuclear Weapons Institute can profitably service the need for computer-aided drug design (CADD) programs. The Russian physicists' primary competitive advantages are their ability to write particularly efficient code able to work with limited computing power; a history of working with very large, complex modeling systems; an extensive knowledge of physics and mathematics; and price competitiveness. Their primary competitive disadvantages are their lack of background in biology, and cultural and geographic issues. The first phase of the study focused on defining the competitive landscape, primarily through interviews with and literature searches on the key providers of CADD software. The second phase focused on users of CADD technology to determine deficiencies in the current product offerings, to understand what product they most desired, and to define the potential demand for such a product.

  18. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  19. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
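
    As a toy reference for the client-server half of the hybrid model described in the two records above, the sketch below implements a minimal in-process publish/subscribe hub in Python; the class and parameter names are hypothetical, and the actual information sharing protocol is not reproduced here.

        from collections import defaultdict
        from typing import Callable, Dict, List

        class TelemetryHub:
            """Minimal in-process publish/subscribe hub.

            Illustrates only the client-server half of the hybrid model: a central
            service fans processed telemetry parameters out to subscribed
            flight-controller applications. The peer-to-peer half is not shown.
            """
            def __init__(self) -> None:
                self._subscribers: Dict[str, List[Callable[[str, float], None]]] = defaultdict(list)

            def subscribe(self, parameter: str, callback: Callable[[str, float], None]) -> None:
                self._subscribers[parameter].append(callback)

            def publish(self, parameter: str, value: float) -> None:
                # Fan the update out to every application subscribed to this parameter.
                for callback in self._subscribers[parameter]:
                    callback(parameter, value)

        if __name__ == "__main__":
            hub = TelemetryHub()
            hub.subscribe("cabin_pressure", lambda p, v: print(f"monitor: {p} = {v}"))
            hub.subscribe("cabin_pressure", lambda p, v: print(f"logger:  {p} = {v}"))
            hub.publish("cabin_pressure", 101.3)   # both subscribers receive the update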

  20. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis on new technological developments is performed. Sensors for position and motion detection, and such for equipment and physiological monitoring are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  1. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  2. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final report

    International Nuclear Information System (INIS)

    Du, Q.

    1997-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. The work so far has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (that can be used to understand the basic structure of superconductors and of the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (that can be of use for the design of devices). The models they have considered include a time-dependent Ginzburg-Landau model, a variable-thickness thin-film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions and fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations.
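
    For orientation, one standard nondimensionalized form of the Ginzburg-Landau free energy underlying such mesoscale models is recalled below, where ψ is the complex order parameter, A the magnetic vector potential, H the applied field, and κ the Ginzburg-Landau parameter; the specific scalings and variants used in the project's papers may differ.

        \[
          \mathcal{G}(\psi,\mathbf{A}) \;=\; \int_{\Omega} \Bigl( -\lvert\psi\rvert^{2} + \tfrac{1}{2}\lvert\psi\rvert^{4} + \Bigl\lvert \Bigl( \tfrac{i}{\kappa}\nabla + \mathbf{A} \Bigr) \psi \Bigr\rvert^{2} + \lvert \nabla \times \mathbf{A} - \mathbf{H} \rvert^{2} \Bigr) \, \mathrm{d}\Omega ,
        \]

    with the equilibrium (or time-dependent) Ginzburg-Landau equations arising as the Euler-Lagrange (or gradient-flow) system of this functional.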

  3. Multi-Point Combustion System: Final Report

    Science.gov (United States)

    Goeke, Jerry; Pack, Spencer; Zink, Gregory; Ryon, Jason

    2014-01-01

    A low-NOx emission combustor concept has been developed for NASA's Environmentally Responsible Aircraft (ERA) program to meet N+2 emissions goals for a 70,000 lb thrust engine application. These goals include 75 percent reduction of LTO NOx from CAEP6 standards without increasing CO, UHC, or smoke from that of current state of the art. An additional key factor in this work is to improve lean combustion stability over that of previous work performed on similar technology in the early 2000s. The purpose of this paper is to present the final report for the NASA contract. This work included the design, analysis, and test of a multi-point combustion system. All design work was based on the results of Computational Fluid Dynamics modeling with the end results tested on a medium pressure combustion rig at the UC and a medium pressure combustion rig at GRC. The theories behind the designs, results of analysis, and experimental test data will be discussed in this report. The combustion system consists of five radially staged rows of injectors, where ten small scale injectors are used in place of a single traditional nozzle. Major accomplishments of the current work include the design of a Multipoint Lean Direct Injection (MLDI) array and associated air blast and pilot fuel injectors, which is expected to meet or exceed the goal of a 75 percent reduction in LTO NOx from CAEP6 standards. This design incorporates a reduced number of injectors over previous multipoint designs, simplified and lightweight components, and a very compact combustor section. Additional outcomes of the program are validation that the design of these combustion systems can be aided by the use of Computational Fluid Dynamics to predict and reduce emissions. Furthermore, the staging of fuel through the individually controlled radially staged injector rows successfully demonstrated improved low power operability as well as improvements in emissions over previous multipoint designs. Additional comparison

  4. Computational biology and bioinformatics in Nigeria.

    Science.gov (United States)

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  5. Computational biology and bioinformatics in Nigeria.

    Directory of Open Access Journals (Sweden)

    Segun A Fatumo

    2014-04-01

    Full Text Available Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  6. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained, and after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared to the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy for surgical indication; it increases the physicians' confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  7. Toward a computational theory of conscious processing.

    Science.gov (United States)

    Dehaene, Stanislas; Charles, Lucie; King, Jean-Rémi; Marti, Sébastien

    2014-04-01

    The study of the mechanisms of conscious processing has become a productive area of cognitive neuroscience. Here we review some of the recent behavioral and neuroscience data, with the specific goal of constraining present and future theories of the computations underlying conscious processing. Experimental findings imply that most of the brain's computations can be performed in a non-conscious mode, but that conscious perception is characterized by an amplification, global propagation and integration of brain signals. A comparison of these data with major theoretical proposals suggests that firstly, conscious access must be carefully distinguished from selective attention; secondly, conscious perception may be likened to a non-linear decision that 'ignites' a network of distributed areas; thirdly, information which is selected for conscious perception gains access to additional computations, including temporary maintenance, global sharing, and flexible routing; and finally, measures of the complexity, long-distance correlation and integration of brain signals provide reliable indices of conscious processing, clinically relevant to patients recovering from coma. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. INTERRUPTION TO COMPUTING SERVICES, SATURDAY 9 FEBRUARY

    CERN Multimedia

    2002-01-01

    In order to allow the rerouting of electrical cables which power most of the B513 Computer Room, there will be a complete shutdown of central computing services on Saturday 9 February. This shutdown affects all Central Computing services as well as the general purpose site wide network (137.138), and hence desktop connectivity will also be cut. In order for the physical intervention to start at 07:30, services will be run down as follows (which has been planned to allow the completion of the Friday night backups of AFS and Oracle): Friday 08/02 17:00 Scheduling of batch jobs suspended 18:00 HPSS services shutdown Saturday 09/02 01:00 Any batch jobs that have not ended will be terminated 04:00 Rundown of general purpose services including Mail, Web, NICE, Unix interactive services, ACB, CASTOR, Tape Service and finally Oracle 06:30 AIS services shutdown (including EDH, BHT and HRT) 07:00 AFS service shutdown 07:30 End of guaranteed service on the general purpose site-wide network (137.138) Services will be res...

  9. Nuclear Computational Low Energy Initiative (NUCLEI)

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, Sanjay K. [University of Washington

    2017-08-14

    This is the final report for the University of Washington for the NUCLEI SciDAC-3 project. The NUCLEI project, as defined by the scope of work, will develop, implement and run codes for large-scale computations of many topics in low-energy nuclear physics. Physics to be studied include the properties of nuclei and nuclear decays, nuclear structure and reactions, and the properties of nuclear matter. The computational techniques to be used include Quantum Monte Carlo, Configuration Interaction, Coupled Cluster, and Density Functional methods. The research program will emphasize areas of high interest to current and possible future DOE nuclear physics facilities, including ATLAS and FRIB (nuclear structure and reactions, and nuclear astrophysics), TJNAF (neutron distributions in nuclei, few-body systems, and electroweak processes), NIF (thermonuclear reactions), MAJORANA and FNPB (neutrino-less double-beta decay and physics beyond the Standard Model), and LANSCE (fission studies).

  10. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Schuur, Edward [Northern Arizona Univ., Flagstaff, AZ (United States); Luo, Yiqi [Univ. of Oklahoma, Norman, OK (United States)

    2016-12-01

    This final grant report is a continuation of the final grant report submitted for DE-SC0006982, as the Principal Investigator (Schuur) relocated from the University of Florida to Northern Arizona University. This report summarizes the original project goals, as well as new project activities that were completed in the final period of the project.

  11. Parallel computation with molecular-motor-propelled agents in nanofabricated networks.

    Science.gov (United States)

    Nicolau, Dan V; Lard, Mercy; Korten, Till; van Delft, Falco C M J M; Persson, Malin; Bengtsson, Elina; Månsson, Alf; Diez, Stefan; Linke, Heiner; Nicolau, Dan V

    2016-03-08

    The combinatorial nature of many important mathematical problems, including nondeterministic-polynomial-time (NP)-complete problems, places a severe limitation on the problem size that can be solved with conventional, sequentially operating electronic computers. There have been significant efforts in conceiving parallel-computation approaches in the past, for example: DNA computation, quantum computation, and microfluidics-based computation. However, these approaches have not proven, so far, to be scalable and practical from a fabrication and operational perspective. Here, we report the foundations of an alternative parallel-computation system in which a given combinatorial problem is encoded into a graphical, modular network that is embedded in a nanofabricated planar device. Exploring the network in a parallel fashion using a large number of independent, molecular-motor-propelled agents then solves the mathematical problem. This approach uses orders of magnitude less energy than conventional computers, thus addressing issues related to power consumption and heat dissipation. We provide a proof-of-concept demonstration of such a device by solving, in a parallel fashion, the small instance {2, 5, 9} of the subset sum problem, which is a benchmark NP-complete problem. Finally, we discuss the technical advances necessary to make our system scalable with presently available technology.
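
    For reference, the benchmark instance quoted above is small enough to check by brute force on a conventional computer; the Python sketch below enumerates the achievable subset sums of {2, 5, 9}, i.e., the set of exits at which agents can arrive in the device (an illustration of the problem only, not of the authors' network encoding).

      # Brute-force enumeration of the subset sum instance {2, 5, 9}; each subset
      # corresponds to one include/exclude path through the network.
      from itertools import combinations

      values = (2, 5, 9)
      sums = {}
      for r in range(len(values) + 1):
          for subset in combinations(values, r):
              sums.setdefault(sum(subset), []).append(subset)

      for total in sorted(sums):
          print(total, "<-", sums[total])
      # Achievable totals: 0, 2, 5, 7, 9, 11, 14, 16; any other target has no solution.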

  12. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR to outsource computation tasks to each computing resource provider (CRP. Considering the importance of pricing as an essential incentive to coordinate the real-time interaction among the CRR and CRPs, in this paper, we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the CRR and CRPs behaviors in form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm through the exchange of control messages, which contain the information of computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate the computing resource management as a game with Nash equilibrium achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources towards the benefit of the overall system.
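
    As a toy illustration of price-based coordination (the specific utility and cost functions of the paper are not reproduced here; the forms and parameters below are assumptions), the following Python sketch runs a dual-ascent price update in which a requestor with logarithmic utility and providers with quadratic costs respond to a shared real-time price until demand and supply balance.

      # Toy dual-ascent price iteration between one CRR and three CRPs.
      # Utility a*log(1+d), costs c_i*s_i^2, and the step size are illustrative assumptions.
      import numpy as np

      a = 20.0                        # CRR utility parameter
      c = np.array([0.5, 1.0, 2.0])   # CRP cost coefficients
      price, step = 1.0, 0.05

      for _ in range(500):
          demand = max(a / price - 1.0, 0.0)       # argmax_d  a*log(1+d) - price*d
          supply = price / (2.0 * c)               # argmax_s  price*s_i - c_i*s_i^2
          price = max(price + step * (demand - supply.sum()), 1e-6)

      print(f"price={price:.3f}  demand={demand:.3f}  total supply={supply.sum():.3f}")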

  13. The Particle Beam Optics Interactive Computer Laboratory

    International Nuclear Information System (INIS)

    Gillespie, George H.; Hill, Barrey W.; Brown, Nathan A.; Babcock, R. Chris; Martono, Hendy; Carey, David C.

    1997-01-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab
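
    As a minimal numerical illustration of the first-order matrix formalism that underlies codes such as TRANSPORT (a sketch only, not PBO Lab or TRANSPORT code), the Python snippet below propagates the transverse coordinates (x, x') of a ray through a drift, a thin focusing lens and a second drift.

      # First-order (2x2) transfer matrices for one transverse plane: drift - thin lens - drift.
      import numpy as np

      def drift(L):
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_lens(f):
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # Matrices multiply in reverse order of traversal.
      M = drift(1.0) @ thin_lens(0.5) @ drift(1.0)

      x0 = np.array([0.002, 0.0])      # 2 mm offset, zero initial slope
      print("final (x, x'):", M @ x0)
      print("det(M) (should be 1):", np.linalg.det(M))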

  14. Evaluation on correction factor for in-line X-ray phase contrast computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Mingli; Huang, Zhifeng; Zhang, Li; Zhang, Ran [Tsinghua Univ., Beijing (China). Dept. of Engineering Physics; Ministry of Education, Beijing (China). Key Laboratory of Particle and Radiation Imaging; Yin, Hongxia; Liu, Yunfu; Wang, Zhenchang [Capital Medical Univ., Beijing (China). Medical Imaging Center; Xiao, Tiqiao [Chinese Academy of Sciences, Shanghai (China). Shanghai Inst. of Applied Physics

    2011-07-01

    X-ray in-line phase contrast computed tomography (CT) is an effective nondestructive tool, providing the 3D distribution of the refractive index of a weakly absorbing low-Z object with high resolution and image contrast, especially with high-brilliance third-generation synchrotron radiation sources. The modified Bronnikov algorithm (MBA), one of the in-line phase contrast CT reconstruction algorithms, can reconstruct the refractive index distribution of a pure phase object with a single computed tomographic data set. The key idea of the MBA is to use a correction factor in the filter function to stabilize the behavior at low frequencies. In this paper, we evaluate the influence of the correction factor on the final reconstruction results for absorption-phase-mixed objects with analytical simulation and actual experiments. Finally, the limitations of the MBA are discussed. (orig.)
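
    The role of the correction factor can be conveyed in a short sketch: a Bronnikov-type reconstruction filter behaves like 1/|k|^2 in Fourier space, and the modified algorithm adds a constant to the denominator so that the response stays bounded as |k| goes to 0. The Python snippet below applies such a regularized 2D filter (the value of the correction factor and the test image are illustrative assumptions, not the authors' implementation).

      # Low-frequency-regularized Bronnikov-type filtering: divide the 2D spectrum by
      # (|k|^2 + alpha) instead of |k|^2. alpha is the correction factor (illustrative value).
      import numpy as np

      def regularized_filter(g, alpha=1e-3):
          ky = np.fft.fftfreq(g.shape[0])
          kx = np.fft.fftfreq(g.shape[1])
          k2 = kx[np.newaxis, :]**2 + ky[:, np.newaxis]**2
          return np.real(np.fft.ifft2(np.fft.fft2(g) / (k2 + alpha)))

      g = np.random.default_rng(1).standard_normal((128, 128))   # stand-in projection image
      phi = regularized_filter(g)
      print(phi.shape, float(phi.std()))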

  15. Scientific Computing Kernels on the Cell Processor

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel W.; Shalf, John; Oliker, Leonid; Kamil, Shoaib; Husbands, Parry; Yelick, Katherine

    2007-04-04

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of using the recently-released STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. First, we introduce a performance model for Cell and apply it to several key scientific computing kernels: dense matrix multiply, sparse matrix vector multiply, stencil computations, and 1D/2D FFTs. The difficulty of programming Cell, which requires assembly level intrinsics for the best performance, makes this model useful as an initial step in algorithm design and evaluation. Next, we validate the accuracy of our model by comparing results against published hardware results, as well as our own implementations on a 3.2GHz Cell blade. Additionally, we compare Cell performance to benchmarks run on leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1E) architectures. Our work also explores several different mappings of the kernels and demonstrates a simple and effective programming model for Cell's unique architecture. Finally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.
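
    One of the kernels studied, sparse matrix-vector multiply in compressed sparse row (CSR) form, is shown below as a plain Python/NumPy reference (not the SPE-optimized Cell code of the paper); the indirect accesses to x through the column-index array are what make this kernel a useful probe of memory systems.

      # CSR sparse matrix-vector multiply y = A*x (reference implementation).
      import numpy as np

      def spmv_csr(values, col_idx, row_ptr, x):
          y = np.zeros(len(row_ptr) - 1)
          for i in range(len(y)):
              start, end = row_ptr[i], row_ptr[i + 1]
              # gather x at the column indices of row i, then reduce
              y[i] = np.dot(values[start:end], x[col_idx[start:end]])
          return y

      # 3x3 example matrix [[4, 0, 1], [0, 3, 0], [2, 0, 5]]
      values  = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
      col_idx = np.array([0, 2, 1, 0, 2])
      row_ptr = np.array([0, 2, 3, 5])
      print(spmv_csr(values, col_idx, row_ptr, np.array([1.0, 1.0, 1.0])))   # [5. 3. 7.]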

  16. Quantum computation via local control theory: Direct sum vs. direct product Hilbert spaces

    International Nuclear Information System (INIS)

    Sklarz, Shlomo E.; Tannor, David J.

    2006-01-01

    The central objective in any quantum computation is the creation of a desired unitary transformation; the mapping that this unitary transformation produces between the input and output states is identified with the computation. In [S.E. Sklarz, D.J. Tannor, arXiv:quant-ph/0404081 (submitted to PRA) (2004)] it was shown that local control theory can be used to calculate fields that will produce such a desired unitary transformation. In contrast with previous strategies for quantum computing based on optimal control theory, the local control scheme maintains the system within the computational subspace at intermediate times, thereby avoiding unwanted decay processes. In [S.E. Sklarz et al.], the structure of the Hilbert space had a direct sum structure with respect to the computational register and the mediating states. In this paper, we extend the formalism to the important case of a direct product Hilbert space. The final equations for the control algorithm for the two cases are remarkably similar in structure, despite the fact that the derivations are completely different and that in one case the dynamics is in a Hilbert space and in the other case the dynamics is in a Liouville space. As shown in [S.E. Sklarz et al.], the direct sum implementation leads to a computational mechanism based on virtual transitions, and can be viewed as an extension of the principles of Stimulated Raman Adiabatic Passage from state manipulation to evolution operator manipulation. The direct product implementation developed here leads to the intriguing concept of virtual entanglement - computation that exploits second-order transitions that pass through entangled states but that leaves the subsystems nearly separable at all intermediate times. Finally, we speculate on a connection between the algorithm developed here and the concept of decoherence free subspaces

  17. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sperling, M.; Shreve, D.C.

    1978-12-01

    The computer code DELPHI is an interactive English language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operations instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal and finally a listing of the code.

  18. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    International Nuclear Information System (INIS)

    Sperling, M.; Shreve, D.C.

    1978-01-01

    The computer code DELPHI is an interactive English language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operations instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal and finally a listing of the code

  19. Reaction Diffusion Voronoi Diagrams: From Sensors Data to Computing

    Directory of Open Access Journals (Sweden)

    Alejandro Vázquez-Otero

    2015-05-01

    Full Text Available In this paper, a new method to solve computational problems using reaction diffusion (RD systems is presented. The novelty relies on the use of a model configuration that tailors its spatiotemporal dynamics to develop Voronoi diagrams (VD as a part of the system’s natural evolution. The proposed framework is deployed in a solution of related robotic problems, where the generalized VD are used to identify topological places in a grid map of the environment that is created from sensor measurements. The ability of the RD-based computation to integrate external information, like a grid map representing the environment in the model computational grid, permits a direct integration of sensor data into the model dynamics. The experimental results indicate that this method exhibits significantly less sensitivity to noisy data than the standard algorithms for determining VD in a grid. In addition, previous drawbacks of the computational algorithms based on RD models, like the generation of volatile solutions by means of excitable waves, are now overcome by final stable states.
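
    For comparison with the reaction-diffusion approach described above, the Python sketch below computes a discrete Voronoi labeling of a grid map by multi-source breadth-first search, a standard (non-RD) grid algorithm of the kind the authors benchmark against; the obstacle handling and 4-connectivity are simplified assumptions.

      # Standard discrete Voronoi labeling on a grid: multi-source BFS assigns each free
      # cell to its nearest seed under 4-connected distance. NOT the reaction-diffusion method.
      from collections import deque

      def grid_voronoi(width, height, seeds, obstacles=frozenset()):
          label = {(sx, sy): i for i, (sx, sy) in enumerate(seeds)}
          queue = deque(seeds)
          while queue:
              x, y = queue.popleft()
              for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                  if 0 <= nx < width and 0 <= ny < height \
                          and (nx, ny) not in label and (nx, ny) not in obstacles:
                      label[(nx, ny)] = label[(x, y)]
                      queue.append((nx, ny))
          return label

      labels = grid_voronoi(8, 6, seeds=[(1, 1), (6, 4)])
      print(labels[(0, 0)], labels[(7, 5)])   # 0 1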

  20. Computation of asteroid proper elements on the Grid

    Directory of Open Access Journals (Sweden)

    Novaković B.

    2009-01-01

    Full Text Available A procedure of gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time consuming computations and make them more efficient is justified by the large increase of observational data expected from the next generation all sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids are derived since the beginning of use of the Grid infrastructure for the purpose. The average time for the catalogs update is significantly shortened with respect to the time needed with stand-alone workstations. We also present basics of the Grid computing, the concepts of Grid middleware and its Workload management system. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for the future work.

  1. Computation of Asteroid Proper Elements on the Grid

    Directory of Open Access Journals (Sweden)

    Novaković, B.

    2009-12-01

    Full Text Available A procedure of gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time consuming computations and make them more efficient is justified by the large increase of observational data expected from the next generation all sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids are derived since the beginning of use of the Grid infrastructure for the purpose. The average time for the catalogs update is significantly shortened with respect to the time needed with stand-alone workstations. We also present basics of the Grid computing, the concepts of Grid middleware and its Workload management system. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for the future work.

  2. Application of GPU to computational multiphase fluid dynamics

    International Nuclear Information System (INIS)

    Nagatake, T; Kunugi, T

    2010-01-01

    The MARS (Multi-interfaces Advection and Reconstruction Solver) [1] is one of the surface volume tracking methods for multi-phase flows. Nowadays, the performance of the GPU (Graphics Processing Unit) is much higher than that of the CPU (Central Processing Unit). In this study, the GPU was applied to the MARS in order to accelerate the computation of multi-phase flows (GPU-MARS), and the performance of the GPU-MARS was discussed. From the performance of the interface tracking method on a one-directional advection problem, it is found that the computation on the GPU (single GTX280) was around 4 times faster than that on the CPU (Xeon 5040, 4 threads parallelized). From the performance of the Poisson solver using the algorithm developed in this study, it is found that the GPU was around 30 times faster than the CPU. Finally, it is confirmed that the GPU gives a large acceleration of the fluid flow computation (GPU-MARS) compared to the CPU. However, it is also found that the computation on the GPU must be performed in double precision to achieve very high precision.
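
    For orientation, the kind of kernel ported to the GPU in such solvers can be written compactly on the CPU; the NumPy sketch below runs a Jacobi iteration for a 2D Poisson problem with homogeneous Dirichlet boundaries (a reference form only, not the study's GPU algorithm, and the iteration count is illustrative rather than converged).

      # Jacobi iteration for lap(p) = f on a unit square, p = 0 on the boundary.
      import numpy as np

      n = 64
      h = 1.0 / (n + 1)
      f = np.ones((n + 2, n + 2))          # right-hand side (interior values used)
      p = np.zeros_like(f)                 # boundary rows/columns stay at 0

      for _ in range(2000):
          p[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1] +
                                  p[1:-1, 2:] + p[1:-1, :-2] - h * h * f[1:-1, 1:-1])

      residual = np.abs((p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
                         - 4.0 * p[1:-1, 1:-1]) / (h * h) - f[1:-1, 1:-1]).max()
      print("max residual after 2000 Jacobi sweeps:", float(residual))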

  3. Development of scan analysis techniques employing a small computer. Final report, February 1, 1963--July 31, 1976

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1976-01-01

    During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized

  4. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung-Taek (Korea Atomic Energy Research Institute, Daejon, Korea); Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  5. Numerical computation of soliton dynamics for NLS equations in a driving potential

    Directory of Open Access Journals (Sweden)

    Marco Caliari

    2010-06-01

    Full Text Available We provide numerical computations for the soliton dynamics of the nonlinear Schrodinger equation with an external potential. After computing the ground state solution r of a related elliptic equation we show that, in the semi-classical regime, the center of mass of the solution with initial datum built upon r is driven by the solution to $\ddot x = -\nabla V(x)$. Finally, we provide examples and analyze the numerical errors in the two dimensional case when V is a harmonic potential.
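
    The driving law $\ddot x = -\nabla V(x)$ can be illustrated independently of the full NLS computation; the Python sketch below integrates it with a velocity-Verlet step for a 2D harmonic potential (the potential and initial data are assumed examples, and this is the center-of-mass trajectory only, not the PDE solver used in the paper).

      # Velocity-Verlet integration of x'' = -grad V(x) with V(x) = 0.5*|x|^2 (assumed example).
      import numpy as np

      def grad_V(x):
          return x                      # gradient of 0.5*|x|^2

      dt, steps = 0.01, 1000
      x = np.array([1.0, 0.0])
      v = np.array([0.0, 0.5])
      a = -grad_V(x)
      for _ in range(steps):
          x = x + dt * v + 0.5 * dt**2 * a
          a_new = -grad_V(x)
          v = v + 0.5 * dt * (a + a_new)
          a = a_new

      energy = 0.5 * np.dot(v, v) + 0.5 * np.dot(x, x)
      print("x(T) =", x, " energy =", float(energy))   # energy stays near its initial value 0.625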

  6. Report on the Audit of the Use of Mobile Computers -- Air Force

    Science.gov (United States)

    1991-02-15

    We are providing this final report on the Audit of the Use of Mobile Computers--Air Force for your review and comments. The audit was conducted from...November 1989 through August 1990. The audit was part of our review of the use of mobile computers throughout DoD. Our overall objectives were to...operate, and maintain. The audit showed that the Air Force needed to retain no more than 5 of the 18 TSS’s. This would save the Air Force $27.3 million (Enclosure 2).

  7. Structure problems in the analog computation; Problemes de structure dans le calcul analogique

    Energy Technology Data Exchange (ETDEWEB)

    Braffort, P.L. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    Recent mathematical developments have shown the importance of elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are brought into evidence and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations; but the course of the computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structure as analog computation. This structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests new important domains of application for the analog method. (M.P.)

  8. Structure problems in the analog computation; Problemes de structure dans le calcul analogique

    Energy Technology Data Exchange (ETDEWEB)

    Braffort, P L [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1957-07-01

    Recent mathematical developments have shown the importance of elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are brought into evidence and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations; but the course of the computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structure as analog computation. This structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests new important domains of application for the analog method. (M.P.)

  9. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Science.gov (United States)

    2010-07-01

    ... coolers, and final-cooler cooling towers. 61.134 Section 61.134 Protection of Environment ENVIRONMENTAL... Standard: Naphthalene processing, final coolers, and final-cooler cooling towers. (a) No (“zero”) emissions are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by...

  10. Continuous-Variable Instantaneous Quantum Computing is Hard to Sample.

    Science.gov (United States)

    Douce, T; Markham, D; Kashefi, E; Diamanti, E; Coudreau, T; Milman, P; van Loock, P; Ferrini, G

    2017-02-17

    Instantaneous quantum computing is a subuniversal quantum complexity class, whose circuits have proven to be hard to simulate classically in the discrete-variable realm. We extend this proof to the continuous-variable (CV) domain by using squeezed states and homodyne detection, and by exploring the properties of postselected circuits. In order to treat postselection in CVs, we consider finitely resolved homodyne detectors, corresponding to a realistic scheme based on discrete probability distributions of the measurement outcomes. The unavoidable errors stemming from the use of finitely squeezed states are suppressed through a qubit-into-oscillator Gottesman-Kitaev-Preskill encoding of quantum information, which was previously shown to enable fault-tolerant CV quantum computation. Finally, we show that, in order to render postselected computational classes in CVs meaningful, a logarithmic scaling of the squeezing parameter with the circuit size is necessary, translating into a polynomial scaling of the input energy.

  11. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    Science.gov (United States)

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  12. The role of parents and related factors on adolescent computer use

    Directory of Open Access Journals (Sweden)

    Jennifer A. Epstein

    2012-02-01

    Full Text Available Background. Research has suggested the importance of parents for their adolescents' computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods. The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample's average age was 16, and 63% were girls. Results. A set of regressions with recreational computer use as the dependent variable was run. Conclusions. Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents' use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use for their children's own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs aimed at parents to help them increase the age when their children start using computers and learn how to place limits on recreational computer use are needed.

  13. Development of scan analysis techniques employing a small computer. Final report, February 1, 1963--July 31, 1976

    Energy Technology Data Exchange (ETDEWEB)

    Kuhl, D.E.

    1976-08-05

    During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)

  14. On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things

    Science.gov (United States)

    Huang, Chao

    2017-12-01

    Two new concepts of the information era are cloud computing and the internet of things; although they are defined differently, they are closely related. Building a smart campus by means of cloud computing, the internet of things and other internet technologies is a new way to realize leap-forward development of the campus. Centering on the construction of the smart campus, this paper analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the internet of things.

  15. A Project to Computerize Performance Objectives and Criterion-Referenced Measures in Occupational Education for Research and Determination of Applicability to Handicapped Learners. Final Report.

    Science.gov (United States)

    Lee, Connie W.; Hinson, Tony M.

    This publication is the final report of a 21-month project designed to (1) expand and refine the computer capabilities of the Vocational-Technical Education Consortium of States (V-TECS) to ensure rapid data access for generating routine and special occupational data-based reports; (2) develop and implement a computer storage and retrieval system…

  16. Bayesian Computational Sensor Networks for Aircraft Structural Health Monitoring

    Science.gov (United States)

    2016-02-02

    Final Performance Report (AFRL-AFOSR-VA-TR-2016-0094): Bayesian Computational Sensor Networks for Aircraft Structural Health Monitoring; AFOSR, Air Force Research Laboratory, Air Force Materiel Command; T. C. Henderson and V. J. Mathews, University of Utah. The people who worked on this project include: Thomas C. Henderson, John Mathews, Jingru Zhou, Daimei Zhij, Ahmad Zoubi, Sabita Nahata, Dan Adams

  17. Computer Vision Photogrammetry for Underwater Archaeological Site Recording in a Low-Visibility Environment

    Science.gov (United States)

    Van Damme, T.

    2015-04-01

    Computer Vision Photogrammetry allows archaeologists to accurately record underwater sites in three dimensions using simple two-dimensional picture or video sequences, automatically processed in dedicated software. In this article, I share my experience in working with one such software package, namely PhotoScan, to record a Dutch shipwreck site. In order to demonstrate the method's reliability and flexibility, the site in question is reconstructed from simple GoPro footage, captured in low-visibility conditions. Based on the results of this case study, Computer Vision Photogrammetry compares very favourably to manual recording methods both in recording efficiency, and in the quality of the final results. In a final section, the significance of Computer Vision Photogrammetry is then assessed from a historical perspective, by placing the current research in the wider context of about half a century of successful use of Analytical and later Digital photogrammetry in the field of underwater archaeology. I conclude that while photogrammetry has been used in our discipline for several decades now, for various reasons the method was only ever used by a relatively small percentage of projects. This is likely to change in the near future since, compared to the 'traditional' photogrammetry approaches employed in the past, today Computer Vision Photogrammetry is easier to use, more reliable and more affordable than ever before, while at the same time producing more accurate and more detailed three-dimensional results.

  18. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  19. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  20. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Held, Isaac [Princeton Univ., NJ (United States); Balaji, V. [Princeton Univ., NJ (United States); Fueglistaler, Stephan [Princeton Univ., NJ (United States)

    2016-09-19

    We have constructed and analyzed a series of idealized models of tropical convection interacting with large-scale circulations, with 25-50 km resolution and with 1-2 km cloud-resolving resolution, to set the stage for rigorous tests of convection closure schemes in high-resolution global climate models. Much of the focus has been on the climatology of tropical cyclogenesis in rotating systems and the related problem of the spontaneous aggregation of convection in non-rotating systems. The PI (Held) will be delivering the honorary Bjerknes lecture at the Fall 2016 AGU meeting in December on this work. We have also provided new analyses of long-standing issues related to the interaction between convection and the large-scale circulation: Kelvin waves in the upper troposphere and lower stratosphere, water vapor transport into the stratosphere, and upper tropospheric temperature trends. The results of these analyses help to improve our understanding of processes, and provide tests for future high resolution global modeling. Our final goal of testing new convection schemes in next-generation global atmospheric models at GFDL has been left for future work, owing to the complexity of the idealized model results, uncovered in this work, that are meant as tests for these models, and to computational resource limitations. 11 papers have been published with support from this grant, 2 are in review, and another major summary paper is in preparation.

  1. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  2. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Coton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  3. Digitized adiabatic quantum computing with a superconducting circuit.

    Science.gov (United States)

    Barends, R; Shabani, A; Lamata, L; Kelly, J; Mezzacapo, A; Las Heras, U; Babbush, R; Fowler, A G; Campbell, B; Chen, Yu; Chen, Z; Chiaro, B; Dunsworth, A; Jeffrey, E; Lucero, E; Megrant, A; Mutus, J Y; Neeley, M; Neill, C; O'Malley, P J J; Quintana, C; Roushan, P; Sank, D; Vainsencher, A; Wenner, J; White, T C; Solano, E; Neven, H; Martinis, John M

    2016-06-09

    Quantum mechanics can help to solve complex problems in physics and chemistry, provided they can be programmed in a physical device. In adiabatic quantum computing, a system is slowly evolved from the ground state of a simple initial Hamiltonian to a final Hamiltonian that encodes a computational problem. The appeal of this approach lies in the combination of simplicity and generality; in principle, any problem can be encoded. In practice, applications are restricted by limited connectivity, available interactions and noise. A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction, but uses quantum circuit algorithms that are problem-specific. Here we combine the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. We tomographically probe the system during the digitized evolution and explore the scaling of errors with system size. We then let the full system find the solution to random instances of the one-dimensional Ising problem as well as problem Hamiltonians that involve more complex interactions. This digital quantum simulation of the adiabatic algorithm consists of up to nine qubits and up to 1,000 quantum logic gates. The demonstration of digitized adiabatic quantum computing in the solid state opens a path to synthesizing long-range correlations and solving complex computational problems. When combined with fault-tolerance, our approach becomes a general-purpose algorithm that is scalable.
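
    A classical toy simulation conveys the digitization idea at very small scale: the Python sketch below discretizes the interpolation H(s) = (1 - s) H_B + s H_P into short constant-Hamiltonian steps for a 3-qubit ferromagnetic Ising problem and checks the overlap of the final state with the target ground space (a numerical illustration with assumed parameters, not the nine-qubit superconducting experiment).

      # Digitized adiabatic sweep for 3 qubits: H_B = -sum X_i, H_P = -sum Z_i Z_{i+1}.
      # State-vector simulation; total time, step count and chain length are illustrative.
      import numpy as np
      from scipy.linalg import expm

      I2 = np.eye(2)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)

      def op(single, site, n=3):
          out = np.array([[1.0 + 0j]])
          for i in range(n):
              out = np.kron(out, single if i == site else I2)
          return out

      n = 3
      H_B = -sum(op(X, i, n) for i in range(n))
      H_P = -sum(op(Z, i, n) @ op(Z, i + 1, n) for i in range(n - 1))

      steps, T = 40, 20.0
      dt = T / steps
      psi = np.ones(2**n, dtype=complex) / np.sqrt(2**n)    # ground state of H_B
      for k in range(steps):
          s = (k + 0.5) / steps
          psi = expm(-1j * dt * ((1 - s) * H_B + s * H_P)) @ psi

      p_success = abs(psi[0])**2 + abs(psi[-1])**2          # overlap with |000> and |111>
      print("overlap with the ferromagnetic ground space:", round(float(p_success), 3))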

  4. Challenge for knowledge information processing systems (preliminary report on Fifth Generation Computer Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The author explains the reasons, aims and strategies for the Fifth Generation Computer Project in Japan. The project aims to introduce a radical new breed of computer by 1990. This article outlines the economic and social reasons for the project. It describes the impacts and effects that these computers are expected to have. The areas of technology which will form the contents of the research and development are highlighted. These are areas such as VLSI technology, speech and image understanding systems, artificial intelligence and advanced architecture design. Finally a schedule for completion of research is given which aims for a completed project by 1990.

  5. Bilaterally Weighted Patches for Disparity Map Computation

    Directory of Open Access Journals (Sweden)

    Laura Fernández Julià

    2015-03-01

    Full Text Available Visual correspondence is the key to 3D reconstruction in binocular stereovision. Local methods perform block-matching to compute the disparity, or apparent motion, of pixels between images. The simplest approach computes the distance of patches, usually square windows, and assumes that all pixels in the patch have the same disparity. A prominent artifact of the method is the "foreground fattening effect" near depth discontinuities. In order to find a more appropriate support, Yoon and Kweon introduced the use of weights based on color similarity and spatial distance, analogous to those used in the bilateral filter. This paper presents the theory of this method and the implementation we have developed. Moreover, some variants are discussed and improvements are used in the final implementation. Several examples and tests are presented and the parameters and performance of the method are analyzed.
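
    The weight definition at the heart of the method can be stated compactly: for a pixel q in the window centred on p, w(p, q) = exp(-Delta_c(p, q)/gamma_c - Delta_s(p, q)/gamma_s), with Delta_c a colour difference and Delta_s the spatial distance, and the matching cost is aggregated with weights from both windows. The Python sketch below does this for a single pair of windows (grayscale instead of colour, and the gamma values are illustrative assumptions).

      # Adaptive support weights in the spirit of Yoon-Kweon, for one window pair.
      import numpy as np

      def support_weights(patch, gamma_c=10.0, gamma_s=7.0):
          r = patch.shape[0] // 2
          yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
          delta_c = np.abs(patch - patch[r, r])      # photometric difference to the centre
          delta_s = np.sqrt(xx**2 + yy**2)           # spatial distance to the centre
          return np.exp(-delta_c / gamma_c - delta_s / gamma_s)

      def weighted_cost(patch_l, patch_r):
          # Symmetric aggregation: weights from both the reference and the target window.
          w = support_weights(patch_l) * support_weights(patch_r)
          return np.sum(w * np.abs(patch_l - patch_r)) / np.sum(w)

      patch = np.random.default_rng(0).uniform(0, 255, (9, 9))
      print(weighted_cost(patch, patch + 1.0))       # a constant offset gives cost 1.0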

  6. Feasibility of Computer-Based Videogame Therapy for Children with Cerebral Palsy.

    Science.gov (United States)

    Radtka, Sandra; Hone, Robert; Brown, Charles; Mastick, Judy; Melnick, Marsha E; Dowling, Glenna A

    2013-08-01

    Standing and gait balance problems are common in children with cerebral palsy (CP), resulting in falls and injuries. Task-oriented exercises to strengthen and stretch muscles that shift the center of mass and change the base of support are effective in improving balance. Gaming environments can be challenging and fun, encouraging children to engage in exercises at home. The aims of this project were to demonstrate the technical feasibility, ease of use, appeal, and safety of a computer-based videogame program designed to improve balance in children with CP. This study represents a close collaboration between computer design and clinical team members. The first two phases were performed in the laboratory, and the final phase was done in subjects' homes. The prototype balance game was developed using computer-based real-time three-dimensional programming that enabled the team to capture engineering data necessary to tune the system. Videogame modifications, including identifying compensatory movements, were made in an iterative fashion based on feedback from subjects and observations of clinical and software team members. Subjects (n = 14) scored the game 21.5 out of 30 for ease of use and appeal, 4.0 out of 5 for enjoyment, and 3.5 on comprehension. There were no safety issues, and the games performed without technical flaws in final testing. A computer-based videogame incorporating therapeutic movements to improve gait and balance in children with CP was appealing and feasible for home use. A follow-up study examining its effectiveness in improving balance in children with CP is recommended.

  7. Shutdown and degradation: Space computers for nuclear application, verification of radiation hardness. Final report

    International Nuclear Information System (INIS)

    Eichhorn, E.; Gerber, V.; Schreyer, P.

    1995-01-01

    (1) Employment of those radiation hard electronics which are already known in military and space applications. (2) The experience in space-flight shall be used to investigate nuclear technology areas, for example, by using space electronics to prove the range of applications in nuclear radiating environments. (3) Reproduction of a computer developed for telecommunication satellites; proof of radiation hardness by radiation tests. (4) At 328 Krad (Si) first failure of radiation tolerant devices with 100 Krad (Si) hardness guaranteed. (5) Using radiation hard devices of the same type you can expect applications at doses of greater than 1 Mrad (Si). Electronic systems applicable for radiation categories D, C and lower part of B for manipulators, vehicles, underwater robotics. (orig.) [de

  8. Computing effective properties of random heterogeneous materials on heterogeneous parallel processors

    Science.gov (United States)

    Leidi, Tiziano; Scocchi, Giulio; Grossi, Loris; Pusterla, Simone; D'Angelo, Claudio; Thiran, Jean-Philippe; Ortona, Alberto

    2012-11-01

    In recent decades, finite element (FE) techniques have been extensively used for predicting effective properties of random heterogeneous materials. In the case of very complex microstructures, the choice of numerical methods for the solution of this problem can offer some advantages over classical analytical approaches, and it allows the use of digital images obtained from real material samples (e.g., using computed tomography). On the other hand, having a large number of elements is often necessary for properly describing complex microstructures, ultimately leading to extremely time-consuming computations and high memory requirements. With the final objective of reducing these limitations, we improved an existing freely available FE code for the computation of effective conductivity (electrical and thermal) of microstructure digital models. To allow execution on hardware combining multi-core CPUs and a GPU, we first translated the original algorithm from Fortran to C, and we subdivided it into software components. Then, we enhanced the C version of the algorithm for parallel processing with heterogeneous processors. With the goal of maximizing the obtained performances and limiting resource consumption, we utilized a software architecture based on stream processing, event-driven scheduling, and dynamic load balancing. The parallel processing version of the algorithm has been validated using a simple microstructure consisting of a single sphere located at the centre of a cubic box, yielding consistent results. Finally, the code was used for the calculation of the effective thermal conductivity of a digital model of a real sample (a ceramic foam obtained using X-ray computed tomography). On a computer equipped with dual hexa-core Intel Xeon X5670 processors and an NVIDIA Tesla C2050, the parallel application version features near to linear speed-up progression when using only the CPU cores. It executes more than 20 times faster when additionally using the GPU.
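
    For a feel of the underlying computation, the Python sketch below estimates the effective thermal conductivity of a small 2D two-phase voxel model by relaxing the steady-state heat equation between two fixed-temperature faces and averaging the resulting flux; it is a deliberately simplified stand-in (finite differences and serial NumPy, with assumed phase conductivities) for the finite element, CPU/GPU stream-processing code described above.

      # Effective conductivity of a random two-phase 2D voxel model (simplified sketch):
      # Jacobi relaxation of div(k grad T) = 0 with harmonic-mean face conductivities,
      # T = 1 on the left face, T = 0 on the right face, insulated top and bottom.
      import numpy as np

      rng = np.random.default_rng(3)
      k = np.where(rng.random((32, 32)) < 0.3, 10.0, 1.0)   # assumed phase conductivities
      ny, nx = k.shape

      def face_k(ka, kb):
          return 2.0 * ka * kb / (ka + kb)                  # harmonic mean across a face

      T = np.tile(np.linspace(1.0, 0.0, nx), (ny, 1))       # linear initial guess
      for _ in range(4000):
          Tp = np.pad(T, 1, mode='edge')                    # edge padding = zero-flux sides
          kp = np.pad(k, 1, mode='edge')
          kE = face_k(kp[1:-1, 1:-1], kp[1:-1, 2:])
          kW = face_k(kp[1:-1, 1:-1], kp[1:-1, :-2])
          kN = face_k(kp[1:-1, 1:-1], kp[:-2, 1:-1])
          kS = face_k(kp[1:-1, 1:-1], kp[2:, 1:-1])
          T = (kE * Tp[1:-1, 2:] + kW * Tp[1:-1, :-2] +
               kN * Tp[:-2, 1:-1] + kS * Tp[2:, 1:-1]) / (kE + kW + kN + kS)
          T[:, 0], T[:, -1] = 1.0, 0.0                      # fixed-temperature faces

      kf = face_k(k[:, :-1], k[:, 1:])                      # internal vertical faces
      flux = (kf * (T[:, :-1] - T[:, 1:])).sum(axis=0).mean()
      print("effective conductivity ~", round(float(flux * (nx - 1) / ny), 3))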

  9. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data are paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than expected or hoped for, which could have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. In addition, the database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  10. Geometric phases and quantum computation

    International Nuclear Information System (INIS)

    Vedral, V.

    2005-01-01

    Full text: In my lectures I will talk about the notion of the geometric phase and explain its relevance for both fundamental quantum mechanics and quantum computation. The phase will first be introduced via the idea of Pancharatnam, which involves interference of three or more light beams. This notion will then be generalized to evolving quantum systems. I will discuss both pure and mixed states as well as unitary and non-unitary evolutions. I will also show how the concept of the vacuum-induced geometric phase arises in quantum optics. A simple measurement scheme involving a Mach-Zehnder interferometer will be presented and used to illustrate all the concepts in the lecture. Finally, I will present a simple generalization of the geometric phase to evolving degenerate states. This will be seen to lead to the possibility of universal quantum computation using geometric effects only. Moreover, it holds the promise of intrinsically fault-tolerant quantum information processing, whose prospects will be outlined at the end of the lecture. (author)

  11. Testing system analysis as form of monitoring on course “Computer informational technologies” (on base of Distant Learning System “Kherson Virtual University”).

    Directory of Open Access Journals (Sweden)

    G. V. Patsukova

    2008-06-01

    In this article, the use of computer testing technologies for current and final examinations in the course “Computer information technologies” for first-year students is considered.

  12. A computer code package for electron transport Monte Carlo simulation

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    1999-01-01

    A computer code package was developed for solving various electron transport problems by Monte Carlo simulation. It is based on a condensed-history Monte Carlo algorithm. In order to obtain reliable results over wide ranges of electron energies and target atomic numbers, specific electron transport techniques were implemented, such as Moliere multiple-scattering angular distributions, the Blunck-Leisegang multiple-scattering energy-loss distribution, and sampling of individual electron-electron and bremsstrahlung interactions. Path-length and lateral displacement correction algorithms and a module for computing collision, radiative and total restricted stopping powers and ranges of electrons are also included. Comparisons of simulation results with experimental measurements are finally presented. (author)
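
    For orientation, the skeleton below sketches one condensed-history step of the kind described above: the energy is decremented with a stopping power and the direction is perturbed by a multiple-scattering angle. The Gaussian (Highland) width used here is a simplification standing in for the Moliere and Blunck-Leisegang distributions of the actual package; the stopping-power value and step parameters are placeholders.

        import math, random

        ELECTRON_MASS_MEV = 0.511

        def highland_theta0(p_mev, beta, step_cm, rad_length_cm, charge=1.0):
            """Width (rad) of the projected multiple-scattering angle from the
            Highland/PDG formula; a Gaussian stand-in for the Moliere distribution
            (quoted as valid for roughly 1e-3 < x/X0 < 100)."""
            t = step_cm / rad_length_cm
            return (13.6 / (beta * p_mev)) * charge * math.sqrt(t) * (1.0 + 0.038 * math.log(t))

        def condensed_history_step(kinetic_mev, step_cm, stopping_mev_per_cm, rad_length_cm):
            """One simplified condensed-history step: continuous (CSDA) energy loss
            plus a sampled polar deflection. A real code would also sample the
            azimuth, rotate the direction vector and apply path-length corrections."""
            kinetic_mev -= stopping_mev_per_cm * step_cm
            p = math.sqrt(kinetic_mev * (kinetic_mev + 2.0 * ELECTRON_MASS_MEV))
            beta = p / (kinetic_mev + ELECTRON_MASS_MEV)
            theta = random.gauss(0.0, highland_theta0(p, beta, step_cm, rad_length_cm))
            return kinetic_mev, theta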

  13. Usability test of the ImPRO, computer-based procedure system

    International Nuclear Information System (INIS)

    Jung, Y.; Lee, J.

    2006-01-01

    ImPRO is a computer-based procedure system that presents procedures as both flowcharts and success logic trees. It was evaluated on the basis of computer-based procedure guidelines and satisfies most requirements, such as those on presentation and functionality. In addition, an SGTR scenario was performed with ImPRO to evaluate reading comprehension and situation awareness. ImPRO is a software engine that interprets a procedure script language, so it is reliable by nature and has been verified with formal methods. One bug, however, remained hidden for a year after release, but it was fixed. Finally, backup paper procedures can be prepared in the same format as the VDU presentation in case of ImPRO failure. (authors)

  14. PC as physics computer for LHC?

    CERN Document Server

    Jarp, S; Simmins, A; Yaari, R; Jarp, Sverre; Tang, Hong; Simmins, Antony; Yaari, Refael

    1995-01-01

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we argue that the same phenomenon might happen again. We describe a project, active since March this year in the Physics Data Processing group of CERN's CN division, in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results from comparisons with existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of the commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation fa...

  15. Research on the application in disaster reduction for using cloud computing technology

    Science.gov (United States)

    Tao, Liang; Fan, Yida; Wang, Xingling

    Cloud computing technology has recently been applied rapidly in different domains, promoting the progress of informatization in those domains. Based on an analysis of the application requirements in disaster reduction, and combining these with the characteristics of cloud computing technology, we present research on the application of cloud computing technology in disaster reduction. First of all, we give the architecture of a disaster reduction cloud, which consists of disaster reduction infrastructure as a service (IaaS), a disaster reduction cloud application platform as a service (PaaS) and disaster reduction software as a service (SaaS). Secondly, we discuss the standard system for disaster reduction in five aspects. Thirdly, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help to solve the problems of disaster reduction and promote its development.

  16. Computer Navigation-aided Resection of Sacral Chordomas

    Directory of Open Access Journals (Sweden)

    Yong-Kun Yang

    2016-01-01

    Background: Resection of sacral chordomas is challenging. The anatomy is complex, and there are often no bony landmarks to guide the resection. Achieving adequate surgical margins is, therefore, difficult, and the recurrence rate is high. Use of computer navigation may allow optimal preoperative planning and improve precision in tumor resection. The purpose of this study was to evaluate the safety and feasibility of computer navigation-aided resection of sacral chordomas. Methods: Between 2007 and 2013, a total of 26 patients with sacral chordoma who underwent computer navigation-aided surgery were included and followed for a minimum of 18 months. There were 21 primary cases and 5 recurrent cases, with a mean age of 55.8 years old (range: 35-84 years old). Tumors were located above the level of the S3 neural foramen in 23 patients and below the level of the S3 neural foramen in 3 patients. Three-dimensional images were reconstructed with a computed tomography-based navigation system combined with the magnetic resonance images using the navigation software. Tumors were resected via a posterior approach assisted by the computer navigation. Mean follow-up was 38.6 months (range: 18-84 months). Results: Mean operative time was 307 min. Mean intraoperative blood loss was 3065 ml. For computer navigation, the mean registration deviation during surgery was 1.7 mm. There were 18 wide resections, 4 marginal resections, and 4 intralesional resections. All patients were alive at the final follow-up, with 2 (7.7%) exhibiting tumor recurrence. The other 24 patients were tumor-free. The mean Musculoskeletal Tumor Society Score was 27.3 (range: 19-30). Conclusions: Computer-assisted navigation can be safely applied to the resection of sacral chordomas, allowing execution of preoperative plans and achieving good oncological outcomes. Nevertheless, this needs to be accomplished by surgeons with adequate experience and skill.

  17. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  18. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    Science.gov (United States)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms are proposed for computing this time-optimal consensus point, the control law to be used by each agent, and the time taken for consensus to occur. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm, using convexity of attainable sets and Helly's theorem, is proposed to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N^2) run-time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
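
    As a simplified illustration of the geometric idea only (not the paper's coupled formulation, nor its Helly-theorem-based parallelisation): for decoupled planar agents with a common speed bound v_max, the minimum-time meeting point is the centre of the minimum enclosing circle of the initial positions, and the consensus time is its radius divided by v_max. The brute-force sketch below assumes a small number of agents.

        import itertools, math

        def _circumcircle(a, b, c):
            ax, ay = a; bx, by = b; cx, cy = c
            d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
            if abs(d) < 1e-12:
                return None  # (nearly) collinear points
            ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
                  + (cx**2 + cy**2) * (ay - by)) / d
            uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
                  + (cx**2 + cy**2) * (bx - ax)) / d
            return (ux, uy), math.dist((ux, uy), a)

        def _keep_if_valid(cands, points, best):
            for centre, r in cands:
                if all(math.dist(centre, p) <= r + 1e-9 for p in points):
                    if best is None or r < best[1]:
                        best = (centre, r)
            return best

        def min_time_consensus(points, v_max):
            """Brute-force minimum enclosing circle (fine for small N): try every
            circle defined by 2 or 3 agents, keep the smallest containing everyone.
            Returns (consensus point, minimum time)."""
            best = None
            for p, q in itertools.combinations(points, 2):
                centre = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
                best = _keep_if_valid([(centre, math.dist(p, q) / 2.0)], points, best)
            for tri in itertools.combinations(points, 3):
                cc = _circumcircle(*tri)
                if cc is not None:
                    best = _keep_if_valid([cc], points, best)
            centre, radius = best
            return centre, radius / v_max

        print(min_time_consensus([(0, 0), (4, 0), (2, 3), (1, 1)], v_max=1.0))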

  19. Status of computer codes available in AEOI for reactor physics analysis

    International Nuclear Information System (INIS)

    Karbassiafshar, M.

    1986-01-01

    Many of the nuclear computer codes available at the Atomic Energy Organization of Iran (AEOI) can be used for physics analysis of an operating reactor or for design purposes. A grasp of the various methods involved and practical experience with these codes would be the starting point for interesting design studies or for analysis of the operating conditions of presently existing and future reactors. A review of the objectives and flowcharts of commonly practised procedures in reactor physics analysis of LWRs and the related computer codes is made, with reference to the nationally and internationally available resources. Finally, effective utilization of the existing facilities is discussed and called for.

  20. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop parts of frame theory in Banach spaces from the point of view of computable analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd-frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
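
    For readers unfamiliar with the objects being made computable, the classical (non-effective) notion of a Banach frame reads roughly as follows; this standard definition is given here only for orientation, with $X_d$ an associated BK sequence space, and is not taken from the article itself.

        % A family (g_i) in X* together with a bounded operator S : X_d -> X is a
        % Banach frame for X with respect to X_d if, for every x in X:
        \begin{align*}
          &(i)\quad (g_i(x))_{i\in\mathbb{N}} \in X_d,\\
          &(ii)\quad A\,\|x\|_X \;\le\; \big\|(g_i(x))_i\big\|_{X_d} \;\le\; B\,\|x\|_X
              \quad\text{for some constants } 0 < A \le B,\\
          &(iii)\quad S\big((g_i(x))_i\big) = x .
        \end{align*}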

  1. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamics problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  2. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo; Abdelaziz, Ibrahim; Aldilaijan, Abdulla; Canini, Marco; Kalnis, Panos

    2017-01-01

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.
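
    The data-reduction argument can be made concrete with a toy model: if N workers each produce a gradient vector and a programmable switch sums them on the way to the parameter server, the server receives one vector instead of N. The sketch below only emulates that aggregation in NumPy to illustrate the reduction ratio; it is not the DAIET prototype, and the sizes are arbitrary.

        import numpy as np

        def with_aggregation(worker_grads):
            """Emulate an in-network aggregator: the switch forwards a single
            summed vector, so server-side traffic is 1/N of the baseline."""
            return np.sum(worker_grads, axis=0)

        workers, dim = 8, 1_000_000
        grads = [np.random.randn(dim).astype(np.float32) for _ in range(workers)]

        baseline_bytes = sum(g.nbytes for g in grads)       # every gradient crosses the last hop
        aggregated_bytes = with_aggregation(grads).nbytes   # only the sum does
        print(f"data reduction: {1 - aggregated_bytes / baseline_bytes:.1%}")  # 87.5% for N=8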

  3. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo

    2017-11-27

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.

  4. GRAPH-BASED POST INCIDENT INTERNAL AUDIT METHOD OF COMPUTER EQUIPMENT

    Directory of Open Access Journals (Sweden)

    I. S. Pantiukhin

    2016-05-01

    A graph-based method for the post-incident internal audit of computer equipment is proposed. The essence of the proposed solution consists in establishing relationships among hard disk dumps (images), RAM dumps and network captures. The method is intended for describing the properties of an information security incident during the internal post-incident audit of computer equipment. First, hard disk dumps are acquired and then separated into a set of components. This set of components includes a large set of attributes that forms the basis for building the graph. The separated data are recorded into a non-relational database management system (NoSQL) adapted for graph storage, fast access and processing. A dump-linking procedure is applied at the final step. The presented method gives a human expert in information security or computer forensics the possibility of a more precise and informative internal audit of computer equipment. The proposed method reduces the time spent on the internal audit of computer equipment and increases the accuracy and informativeness of such an audit. The method has development potential and can be applied along with other components in the tasks of user identification and computer forensics.
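
    A minimal sketch of the linking idea, assuming artefacts extracted from the different dumps have already been reduced to attribute dictionaries: nodes are artefacts, and an edge is added whenever two artefacts from different sources share an attribute value (e.g. an IP address seen both in RAM and in the network capture). The attribute names and the in-memory adjacency map are illustrative; the method itself stores the graph in a NoSQL back end.

        from collections import defaultdict
        from itertools import combinations

        def build_incident_graph(artefacts):
            """artefacts: list of dicts such as
               {"id": "ram:proc_412", "source": "ram", "ip": "10.0.0.5", "hash": None}
            Returns an adjacency map linking artefacts that share any attribute value."""
            graph = defaultdict(set)
            for a, b in combinations(artefacts, 2):
                if a["source"] == b["source"]:
                    continue  # only cross-source links are interesting for the audit
                shared = {k for k in (a.keys() & b.keys()) - {"id", "source"}
                          if a[k] is not None and a[k] == b[k]}
                if shared:
                    graph[a["id"]].add(b["id"])
                    graph[b["id"]].add(a["id"])
            return graph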

  5. Computational modeling of neural plasticity for self-organization of neural networks.

    Science.gov (United States)

    Chrol-Cannon, Joseph; Jin, Yaochu

    2014-11-01

    Self-organization in biological nervous systems during the lifetime is known to largely occur through a process of plasticity that depends on the spike-timing activity in connected neurons. In the field of computational neuroscience, much effort has been dedicated to building computational models of neural plasticity that replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks for accomplishing machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models of neural plasticity and discuss various ideas about neural plasticity's role. Finally, we suggest a few promising research directions, in particular those that combine findings from computational neuroscience and systems biology and their synergistic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
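
    One of the canonical models reviewed in this literature is pair-based spike-timing-dependent plasticity (STDP), in which the sign and size of a weight change depend on the interval between pre- and postsynaptic spikes. The sketch below uses the standard exponential window; the amplitudes and time constants are illustrative values, not taken from the article.

        import math

        def stdp_delta_w(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
            """Pair-based STDP weight update, with dt_ms = t_post - t_pre.
            Pre-before-post (dt > 0) potentiates, post-before-pre (dt < 0) depresses."""
            if dt_ms > 0:
                return a_plus * math.exp(-dt_ms / tau_plus)
            elif dt_ms < 0:
                return -a_minus * math.exp(dt_ms / tau_minus)
            return 0.0

        # A presynaptic spike 5 ms before the postsynaptic one strengthens the synapse
        print(stdp_delta_w(5.0), stdp_delta_w(-5.0))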

  6. Delayed diagnosis of intermittent mesenteroaxial volvulus of the stomach by computed tomography: a case report

    Directory of Open Access Journals (Sweden)

    Woon Colin

    2008-11-01

    Introduction: Gastric volvulus is a rare condition. Presenting acutely, mesenteroaxial gastric volvulus has characteristic symptoms and may be easily detected with upper gastrointestinal contrast studies. In contrast, subacute, intermittent cases present with intermittent vague symptoms from episodic twisting and untwisting. Imaging in these cases is only useful if performed in the symptomatic interval. Case presentation: We describe a patient with a long history of intermittent chest and epigastric pain. An earlier barium meal was not diagnostic. The diagnosis was finally secured during the current admission by a combination of (1) serum investigations, (2) endoscopy, and finally (3) computed tomography. Conclusion: Non-specific and misleading symptoms and signs may delay the diagnosis of intermittent, subacute volvulus. Imaging studies performed in the well interval may be non-diagnostic. Elevated creatine kinase and aldolase of a non-cardiac cause and endoscopic findings of ischaemic ulceration and difficulty in negotiating the pylorus may raise the suspicion of gastric volvulus. In this case, abdominal computed tomography with spatial reconstruction was crucial in securing the final diagnosis.

  7. 7X performance results - final report : ASCI Red vs Red Storm.

    Energy Technology Data Exchange (ETDEWEB)

    Dinge, Dennis C. (Cray Inc., Albuquerque, NM); Davis, Michael E. (Cray Inc., Albuquerque, NM); Haskell, Karen H.; Ballance, Robert A.; Gardiner, Thomas Anthony; Stevenson, Joel O.; Noe, John P.

    2011-04-01

    The goal of the 7X performance testing was to assure Sandia National Laboratories, Cray Inc., and the Department of Energy that Red Storm would achieve its performance requirements, which were defined as a comparison between ASCI Red and Red Storm. Our approach was to identify one or more problems for each application in the 7X suite, run those problems at multiple processor sizes in the capability computing range, and compare the results between ASCI Red and Red Storm. The first part of this report describes the two computer systems, the applications in the 7X suite, the test problems, and the results of the performance tests on ASCI Red and Red Storm. During the course of the testing on Red Storm, we had the opportunity to run the test problems in both single-core mode and dual-core mode, and the second part of this report describes those results. Finally, we reflect on lessons learned in undertaking a major head-to-head benchmark comparison.

  8. Direct extraction of boundaries from computed tomography scans

    International Nuclear Information System (INIS)

    Thirion, J.P.

    1994-01-01

    This paper presents a method, based on the filtered backprojection (FBP) technique, to extract boundaries directly from X-ray projection data, without prior image reconstruction. The authors preprocess the raw data in order to compute directly the reconstructed values of the gradient or of the Laplacian at any location in the plane (defined with real coordinates). The reconstructed values of the gradient and of the Laplacian correspond to the exact mathematical definition of the differentials of the image. For noisy data, the authors also propose an extension of existing FBP techniques, adapted to the computation of the gradient and of the Laplacian. Finally, the authors show how to use the corresponding operators to perform the segmentation of a slice without image reconstruction. Images of the reconstructed gradient, Laplacian, and segmented objects are presented.
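
    The possibility of reconstructing differentials directly follows from differentiating the filtered backprojection formula under the integral sign. A sketch of the underlying identity (in the spirit of the paper, though its operators and filters differ in detail) is:

        % FBP: q_theta is the ramp-filtered projection p_theta
        f(x,y) \;=\; \int_0^{\pi} q_\theta\!\big(x\cos\theta + y\sin\theta\big)\, d\theta ,
        \qquad
        \frac{\partial f}{\partial x}(x,y) \;=\; \int_0^{\pi} \cos\theta\;
        q_\theta'\!\big(x\cos\theta + y\sin\theta\big)\, d\theta ,

    and similarly for the y-derivative with sin(theta), so the gradient (and, iterating, the Laplacian) can be backprojected directly from filtered, differentiated projections at arbitrary real coordinates.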

  9. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  10. Symbolic computation and its application to high energy physics

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1981-01-01

    It is clear that we are in the middle of an electronic revolution whose effect will be as profound as the industrial revolution. The continuing advances in computing technology will provide us with devices which will make present day computers appear primitive. In this environment, the algebraic and other non-numerical capabilities of such devices will become increasingly important. These lectures will review the present state of the field of algebraic computation and its potential for problem solving in high energy physics and related areas. We shall begin with a brief description of the available systems and examine the data objects which they consider. As an example of the facilities which these systems can offer, we shall then consider the problem of analytic integration, since this is so fundamental to many of the calculational techniques used by high energy physicists. Finally, we shall study the implications which the current developments in hardware technology hold for scientific problem solving. (orig.)
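
    As a small illustration of the kind of analytic integration such systems automate (using the modern SymPy library rather than the systems of the period, and a textbook integral rather than an actual loop calculation):

        import sympy as sp

        x = sp.symbols('x')
        a = sp.symbols('a', positive=True)

        # A propagator-like integral evaluated symbolically rather than numerically
        result = sp.integrate(1 / (x**2 + a**2), (x, -sp.oo, sp.oo))
        print(result)  # pi/a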

  11. ATLAS Distributed Computing experience and performance during the LHC Run-2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160; The ATLAS collaboration

    2017-01-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the...

  12. ATLAS Distributed Computing experience and performance during the LHC Run-2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160; The ATLAS collaboration

    2016-01-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of the Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of...

  13. CFD modelling of axial mixing in the intermediate and final rinses of cleaning-in-place procedures of straight pipes

    DEFF Research Database (Denmark)

    Yang, Jifeng; Jensen, Bo Boye Busk; Nordkvist, Mikkel

    2018-01-01

    The intermediate and final rinses of straight pipes, in which water replaces a cleaning agent of similar density and viscosity, are modelled using Computational Fluid Dynamics (CFD) methods. It is anticipated that the displacement process is achieved by convective and diffusive transport. The simu...

  14. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final technical report

    International Nuclear Information System (INIS)

    Gunzburger, M.D.; Peterson, J.S.

    1998-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. Their work has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (which can be used to understand the basic structure of superconductors and the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (which can be of use for the design of devices). The models the authors have considered include a time-dependent Ginzburg-Landau model, a variable-thickness thin film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions, fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations.
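
    For context, one common nondimensionalisation of the steady Ginzburg-Landau free energy that this family of models builds on is sketched below; it is given only for orientation, and the project's time-dependent, thin-film, anisotropic and Lawrence-Doniach variants modify it in various ways.

        \mathcal{G}(\psi, \mathbf{A}) \;=\; \int_{\Omega}
        \Big(
          \big|\big(\tfrac{i}{\kappa}\nabla + \mathbf{A}\big)\psi\big|^{2}
          \;+\; \tfrac{1}{2}\big(|\psi|^{2} - 1\big)^{2}
          \;+\; |\nabla \times \mathbf{A} - \mathbf{H}|^{2}
        \Big)\, d\Omega ,

    where psi is the complex order parameter, A the magnetic potential, H the applied field and kappa the Ginzburg-Landau parameter.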

  15. Evaluation of Musculoskeletal Disorders among computer Users in Isfahan

    Directory of Open Access Journals (Sweden)

    Ayoub Ghanbary

    2015-08-01

    Along with the widespread use of computers, work-related musculoskeletal disorders (MSDs) have become the most prevalent ergonomic problem among computer users. Evaluating musculoskeletal disorders among computer users makes it possible to plan interventions to reduce them. The aim of the present study was to assess musculoskeletal disorders among computer users at Isfahan University using the Rapid Office Strain Assessment (ROSA) method and the Nordic questionnaire. This cross-sectional study was conducted on 96 computer users at Isfahan University. The data were analyzed in SPSS 20 using correlation, linear regression, descriptive statistics and the ANOVA test. The data collection tools were the Nordic questionnaire and the ROSA checklist. The results of the Nordic questionnaire showed that musculoskeletal disorders in computer users were most prevalent in the shoulder (62.1%), neck (54.9%) and back (53.1%). Based on the ROSA risk level, 19 individuals were in the low-risk area, 50 in the notification area and 27 in the hazard area requiring ergonomic intervention. The prevalence of musculoskeletal disorders was higher in women than in men. The ANOVA test also showed a direct and significant correlation between age and work experience and the final ROSA score (p<0.001). The results showed that the prevalence of MSDs among computer users at Isfahan University is rather high, and ergonomic interventions should be carried out, such as redesigning computer workstations, educating users about the ergonomic principles of computer work, reducing working hours at the computer, and keeping the elbows close to the body at an angle between 90 and 120 degrees.

  16. The Particle Beam Optics Interactive Computer Laboratory

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Brown, N.A.; Babcock, R.C.; Martono, H.; Carey, D.C.

    1997-01-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab. copyright 1997 American Institute of Physics
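
    At first order, the kind of calculation the computational engines perform reduces to multiplying transfer matrices along the beamline. The sketch below shows a drift and a thin-lens quadrupole acting on a (position, angle) ray in one transverse plane; it is a generic textbook illustration, not the TRANSPORT implementation.

        import numpy as np

        def drift(length_m):
            """First-order transfer matrix of a field-free drift."""
            return np.array([[1.0, length_m],
                             [0.0, 1.0]])

        def thin_quad(focal_length_m):
            """Thin-lens quadrupole (focusing in this plane for f > 0)."""
            return np.array([[1.0, 0.0],
                             [-1.0 / focal_length_m, 1.0]])

        # Ray = (transverse position [m], angle [rad]); elements act right-to-left
        ray_in = np.array([0.001, 0.0005])
        beamline = drift(1.0) @ thin_quad(0.5) @ drift(2.0)
        print(beamline @ ray_in)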

  17. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Søndergaard, Hans

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers and in particular their impact on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way.

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  19. Computational logic with square rings of nanomagnets

    Science.gov (United States)

    Arava, Hanu; Derlet, Peter M.; Vijayakumar, Jaianth; Cui, Jizhai; Bingham, Nicholas S.; Kleibert, Armin; Heyderman, Laura J.

    2018-06-01

    Nanomagnets are a promising low-power alternative to traditional computing. However, the successful implementation of nanomagnets in logic gates has been hindered so far by a lack of reliability. Here, we present a novel design with dipolar-coupled nanomagnets arranged on a square lattice to (i) support transfer of information and (ii) perform logic operations. We introduce a thermal protocol, using thermally active nanomagnets as a means to perform computation. Within this scheme, the nanomagnets are initialized by a global magnetic field and thermally relax on raising the temperature with a resistive heater. We demonstrate error-free transfer of information in chains of up to 19 square rings and we show a high level of reliability with successful gate operations of ∼94% across more than 2000 logic gates. Finally, we present a functionally complete prototype NAND/NOR logic gate that could be implemented for advanced logic operations. Here we support our experiments with simulations of the thermally averaged output and determine the optimal gate parameters. Our approach provides a new pathway to a long standing problem concerning reliability in the use of nanomagnets for computation.

  20. Scheduling multimedia services in cloud computing environment

    Science.gov (United States)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in cloud computing environments. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the QoS attributes, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically searches for reasonable resource allocations that satisfy the trust requirement and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
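
    A minimal sketch of the kind of decision rule such a scheme implies, assuming a Beta-distribution (Bayesian) estimate of subjective trust from past interaction outcomes and a normalised QoS score for objective trust; the weights, field names and the greedy selection are illustrative and are not the paper's algorithm.

        def subjective_trust(successes, failures):
            """Expected value of a Beta(successes + 1, failures + 1) posterior."""
            return (successes + 1.0) / (successes + failures + 2.0)

        def total_trust(provider, w_subj=0.6, w_obj=0.4):
            return (w_subj * subjective_trust(provider["ok"], provider["fail"])
                    + w_obj * provider["qos_score"])  # qos_score assumed in [0, 1]

        def schedule(task, providers):
            """Greedy pick: among providers meeting deadline, cost and trust
            requirements, choose the most trusted one."""
            feasible = [p for p in providers
                        if p["est_time"] <= task["deadline"]
                        and p["price"] <= task["budget"]
                        and total_trust(p) >= task["min_trust"]]
            return max(feasible, key=total_trust) if feasible else None

        providers = [
            {"name": "A", "ok": 40, "fail": 2, "qos_score": 0.8, "est_time": 12, "price": 5},
            {"name": "B", "ok": 10, "fail": 8, "qos_score": 0.9, "est_time": 8,  "price": 4},
        ]
        print(schedule({"deadline": 15, "budget": 6, "min_trust": 0.7}, providers))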

  1. Minicomputer and computations in chemistry

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The introduction of multiple-precision hardware and longer word lengths has given the minicomputer a much more general potential for chemistry applications. It was the purpose of this workshop to address this potential, particularly as it is related to computations. The workshop brought together persons with minicomputer experience and those who are considering how the minicomputer might enhance their research activities. The workshop sessions were arranged in sequence to address the following questions: (1) Is the general purpose minicomputer an appropriate tool to meet the computational requirements of a chemistry research laboratory. (2) What are the procedures for wisely designing a minicomputer configuration. (3) What special-purpose hardware is available to enhance the speed of a minicomputer. (4) How does one select the appropriate minicomputer and ensure that it can accomplish the tasks for which is was designed. (5) How can one network minicomputers for more efficient and flexible operation. (6) Can one do really large-scale computations on a minicomputer and what modifications are necessary to convert existing programs and algorithms. (7) How can the minicomputer be used to access the maxicomputers at the NRCC. (8) How are computers likely to evolve in the future. (9) What should be the role of the NRCC in relation to minicomputers. This report of the workshop consists mainly of edited transcripts of introductory remarks. These were augmented by relevant bibliographies as an alternative to transcription of the entire workshop. There was no attempt in the workshop to give final answers to the questions that were raised, since the answers are determined in large part by each particular minicomputer environment.

  2. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 47: The value of computer networks in aerospace

    Science.gov (United States)

    Bishop, Ann Peterson; Pinelli, Thomas E.

    1995-01-01

    This paper presents data on the value of computer networks that were obtained from a national survey of 2000 aerospace engineers that was conducted in 1993. Survey respondents reported the extent to which they used computer networks in their work and communication and offered their assessments of the value of various network types and applications. They also provided information about the positive impacts of networks on their work, which presents another perspective on value. Finally, aerospace engineers' recommendations on network implementation present suggestions for increasing the value of computer networks within aerospace organizations.

  3. A case study on support for students' thinking through computer-mediated communication.

    Science.gov (United States)

    Sannomiya, M; Kawaguchi, A

    2000-08-01

    This is a case study of support for thinking through computer-mediated communication. Two graduate students were supervised in their research using computer-mediated communication, which was asynchronous and written; the supervisor was not present. The students' reports pointed out that there was more planning and editing and lower interactivity in this approach relative to face-to-face communication. These attributes were confirmed by their supervisor's report. The students also suggested that the latter was effective in supporting the production stage of thinking in research, while the former approach was effective in supporting the examination of thinking. For distance education to be successful, an appropriate combination of communication media must take students' thinking stages into account. Finally, transient and permanent effects should be distinguished in computer-mediated communication.

  4. Gradient Learning Algorithms for Ontology Computing

    Science.gov (United States)

    Gao, Wei; Zhu, Linli

    2014-01-01

    The gradient learning model has been attracting great attention in view of its promising perspectives for applications in statistics, data dimensionality reduction, and other specific fields. In this paper, we propose a new gradient learning model for ontology similarity measuring and ontology mapping in the multidividing setting. The sample error in this setting is given by virtue of the hypothesis space and the trick of the ontology dividing operator. Finally, two experiments on the plant and humanoid robotics fields verify the efficiency of the new computational model for ontology similarity measurement and ontology mapping applications in the multidividing setting. PMID:25530752

  5. Gradient Learning Algorithms for Ontology Computing

    Directory of Open Access Journals (Sweden)

    Wei Gao

    2014-01-01

    The gradient learning model has been attracting great attention in view of its promising perspectives for applications in statistics, data dimensionality reduction, and other specific fields. In this paper, we propose a new gradient learning model for ontology similarity measuring and ontology mapping in the multidividing setting. The sample error in this setting is given by virtue of the hypothesis space and the trick of the ontology dividing operator. Finally, two experiments on the plant and humanoid robotics fields verify the efficiency of the new computational model for ontology similarity measurement and ontology mapping applications in the multidividing setting.

  6. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The problems of the attainable polarization of electron-ion rings in the acceleration regime and of the separation of the ring components at the final stage of acceleration are studied. The results of computational simulation using the macroparticle method and of experiments on ring acceleration and separation are given. A comparison of the calculation results with experiment is presented.

  7. Computation for LHC experiments: a worldwide computing grid; Le calcul scientifique des experiences LHC: une grille de production mondiale

    Energy Technology Data Exchange (ETDEWEB)

    Fairouz, Malek [Universite Joseph-Fourier, LPSC, CNRS-IN2P3, Grenoble I, 38 (France)

    2010-08-15

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the resulting experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 bytes per second and a recording capacity of a few tens of 10^15 bytes each year. In order to meet this challenge, a computing grid based on dispatching and sharing tasks has been set up: the W-LCG grid (Worldwide LHC Computing Grid), made up of 4 tiers. Tier-0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching them to the 11 Tier-1 centres. A Tier-1 is typically a national centre; it is responsible for keeping a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier-2 centres. A Tier-2 is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier-3 centres, at the level of individual laboratories, provide a complementary and local resource to Tier-2 in terms of data analysis. (A.C.)

  8. Nonlinear simulations with and computational issues for NIMROD

    International Nuclear Information System (INIS)

    Sovinec, C.R.

    1998-01-01

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this

  9. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  10. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  11. DIMEC - Final Report

    DEFF Research Database (Denmark)

    Conrad, Finn

    1997-01-01

    Final report of the research project DIMEC - Danish InfoMechatronic Control, supported by the Danish Technical Research Council, STVF.

  12. A design of a computer complex including vector processors

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1982-12-01

    We, members of the Computing Center of the Japan Atomic Energy Research Institute (JAERI), have been engaged for the past six years in research on the adaptability of vector processing to large-scale nuclear codes. The research has been done in collaboration with researchers and engineers of JAERI and a computer manufacturer. In this research, forty large-scale nuclear codes were investigated from the viewpoint of vectorization. Among them, twenty-six codes were actually vectorized and executed. As a result of the investigation, it is now estimated that about seventy percent of nuclear codes and seventy percent of the total CPU time at JAERI are highly vectorizable. Based on the data obtained by the investigation, (1) the currently vectorizable CPU time, (2) the necessary number of vector processors, (3) the manpower necessary for vectorization of nuclear codes, (4) the computing speed, memory size, number of parallel I/O paths, and size and speed of the I/O buffer of a vector processor suitable for our applications, and (5) the necessary software and operational policy for the use of vector processors are discussed, and finally (6) a computer complex including vector processors is presented in this report. (author)

  13. Image communication scheme based on dynamic visual cryptography and computer generated holography

    Science.gov (United States)

    Palevicius, Paulius; Ragulskis, Minvydas

    2015-01-01

    Computer generated holograms are often exploited to implement optical encryption schemes. This paper proposes the integration of dynamic visual cryptography (an optical technique based on the interplay of visual cryptography and time-averaging geometric moiré) with the Gerchberg-Saxton algorithm. A stochastic moiré grating is used to embed the secret into a single cover image. The secret can be visually decoded by the naked eye only if the amplitude of the harmonic oscillations corresponds to an accurately preselected value. The proposed visual image encryption scheme is based on computer generated holography, optical time-averaging moiré and the principles of dynamic visual cryptography. Dynamic visual cryptography is used both for the initial encryption of the secret image and for the final decryption. Phase data of the encrypted image are computed by using the Gerchberg-Saxton algorithm. The optical image is decrypted using the computationally reconstructed field of amplitudes.
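
    A compact sketch of the Gerchberg-Saxton iteration used to compute such phase data (a generic NumPy version, not the authors' implementation; a uniform source amplitude and a random initial phase are assumed):

        import numpy as np

        def gerchberg_saxton(target_amp, iterations=200):
            """Recover a phase-only hologram whose far field reproduces target_amp.
            Amplitude constraints are enforced alternately in the hologram plane
            (uniform amplitude) and in the image plane (target amplitude)."""
            source_amp = np.ones_like(target_amp)
            field = source_amp * np.exp(1j * 2 * np.pi * np.random.rand(*target_amp.shape))
            for _ in range(iterations):
                image = np.fft.fft2(field)                             # propagate forward
                image = target_amp * np.exp(1j * np.angle(image))      # impose target amplitude
                field = np.fft.ifft2(image)                            # propagate back
                field = source_amp * np.exp(1j * np.angle(field))      # impose source amplitude
            return np.angle(field)                                     # phase data of the hologram

        phase = gerchberg_saxton(np.abs(np.random.rand(64, 64)))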

  14. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute the computing over a great number of distributed computers, rather than local computer ...

  15. Design and Construction of a Brain-Like Computer: A New Class of Frequency-Fractal Computing Using Wireless Communication in a Supramolecular Organic, Inorganic System

    Directory of Open Access Journals (Sweden)

    Subrata Ghosh

    2014-01-01

    Full Text Available Here, we introduce a new class of computer which does not use any circuit or logic gate. In fact, no program needs to be written: it learns by itself and writes its own program to solve a problem. Gödel’s incompleteness argument is explored here to devise an engine where an astronomically large number of “if-then” arguments are allowed to grow by self-assembly, based on the basic set of arguments written in the system, thus, we explore the beyond Turing path of computing but following a fundamentally different route adopted in the last half-a-century old non-Turing adventures. Our hardware is a multilayered seed structure. If we open the largest seed, which is the final hardware, we find several computing seed structures inside, if we take any of them and open, there are several computing seeds inside. We design and synthesize the smallest seed, the entire multilayered architecture grows by itself. The electromagnetic resonance band of each seed looks similar, but the seeds of any layer shares a common region in its resonance band with inner and upper layer, hence a chain of resonance bands is formed (frequency fractal connecting the smallest to the largest seed (hence the name invincible rhythm or Ajeya Chhandam in Sanskrit. The computer solves intractable pattern search (Clique problem without searching, since the right pattern written in it spontaneously replies back to the questioner. To learn, the hardware filters any kind of sensory input image into several layers of images, each containing basic geometric polygons (fractal decomposition, and builds a network among all layers, multi-sensory images are connected in all possible ways to generate “if” and “then” argument. Several such arguments and decisions (phase transition from “if” to “then” self-assemble and form the two giant columns of arguments and rules of phase transition. Any input question is converted into a pattern as noted above, and these two

  16. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques to gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    Energy Technology Data Exchange (ETDEWEB)

    De Supinski, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Caliga, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA") based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  18. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Glasser, Alan H. [Fusion Theory and Computation Inc., Kingston, WA (United States)

    2018-02-02

    Final technical report on DE-SC0016106. This is the final technical report for a portion of the multi-institutional CEMM project. This report is centered around 3 publications and a seminar presentation, which have been submitted to E-Link.

  19. Feasibility of Computer-Based Videogame Therapy for Children with Cerebral Palsy

    Science.gov (United States)

    Radtka, Sandra; Hone, Robert; Brown, Charles; Mastick, Judy; Melnick, Marsha E.

    2013-01-01

    Abstract Objectives Standing and gait balance problems are common in children with cerebral palsy (CP), resulting in falls and injuries. Task-oriented exercises to strengthen and stretch muscles that shift the center of mass and change the base of support are effective in improving balance. Gaming environments can be challenging and fun, encouraging children to engage in exercises at home. The aims of this project were to demonstrate the technical feasibility, ease of use, appeal, and safety of a computer-based videogame program designed to improve balance in children with CP. Materials and Methods This study represents a close collaboration between computer design and clinical team members. The first two phases were performed in the laboratory, and the final phase was done in subjects' homes. The prototype balance game was developed using computer-based real-time three-dimensional programming that enabled the team to capture engineering data necessary to tune the system. Videogame modifications, including identifying compensatory movements, were made in an iterative fashion based on feedback from subjects and observations of clinical and software team members. Results Subjects (n=14) scored the game 21.5 out of 30 for ease of use and appeal, 4.0 out of 5 for enjoyment, and 3.5 on comprehension. There were no safety issues, and the games performed without technical flaws in final testing. Conclusions A computer-based videogame incorporating therapeutic movements to improve gait and balance in children with CP was appealing and feasible for home use. A follow-up study examining its effectiveness in improving balance in children with CP is recommended. PMID:24761324

  20. The “Chimera”: An Off-The-Shelf CPU/GPGPU/FPGA Hybrid Computing Platform

    Directory of Open Access Journals (Sweden)

    Ra Inta

    2012-01-01

    Full Text Available The nature of modern astronomy means that a number of interesting problems exhibit a substantial computational bound and this situation is gradually worsening. Scientists, increasingly fighting for valuable resources on conventional high-performance computing (HPC) facilities—often with a limited customizable user environment—are increasingly looking to hardware acceleration solutions. We describe here a heterogeneous CPU/GPGPU/FPGA desktop computing system (the “Chimera”), built with commercial off-the-shelf components. We show that this platform may be a viable alternative solution to many common computationally bound problems found in astronomy, though not without significant challenges. The most significant bottleneck in pipelines involving real data is most likely to be the interconnect (in this case the PCI Express bus residing on the CPU motherboard). Finally, we speculate on the merits of our Chimera system on the entire landscape of parallel computing, through the analysis of representative problems from UC Berkeley’s “Thirteen Dwarves.”

  1. Heterogeneous Gpu&Cpu Cluster For High Performance Computing In Cryptography

    Directory of Open Access Journals (Sweden)

    Michał Marks

    2012-01-01

    Full Text Available This paper addresses issues associated with distributed computing systems and the application of mixed GPU&CPU technology to data encryption and decryption algorithms. We describe a heterogeneous cluster HGCC formed by two types of nodes: an Intel processor with an NVIDIA graphics processing unit and an AMD processor with an AMD graphics processing unit (formerly ATI), and a novel software framework that hides the heterogeneity of our cluster and provides tools for solving complex scientific and engineering problems. Finally, we present the results of numerical experiments. The considered case study is concerned with parallel implementations of selected cryptanalysis algorithms. The main goal of the paper is to show the wide applicability of the GPU&CPU technology to large scale computation and data processing.

  2. Structure functions and final-state properties in deeply inelastic electron-proton scattering

    International Nuclear Information System (INIS)

    Kharraziha, H.

    1997-01-01

    In this thesis, we give a description of the detailed structure of the proton and a description of the final-state properties in electron-proton scattering. Qualitative results, in a purely gluonic scenario with the leading log approximation, and quantitative results, where quarks are included and some sub-leading corrections have been made, are presented. The quantitative results are in fair agreement with available experimental data and a Monte Carlo event generator for electron-proton scattering is presented. Further, a computer program for calculating QCD colour factors is presented

  3. Computer code package RALLY for probabilistic safety assessment of large technical systems

    International Nuclear Information System (INIS)

    Gueldner, W.; Polke, H.; Spindler, H.; Zipf, G.

    1981-09-01

    This report describes the program system RALLY to compute the reliability of large and intermeshed technical systems. In addition to a short explanation of the different programs, the possible applications of the program system RALLY are demonstrated. Finally, the most important studies carried out so far on RALLY are discussed. (orig.) [de

  4. Infinite possibilities: Computational structures technology

    Science.gov (United States)

    Beam, Sherilee F.

    1994-12-01

    evolve. By using CST in the design and operation of future structures systems, engineers will have a better understanding of how a system responds and lasts, more cost-effective methods of designing and testing models, and improved productivity. For informational and educational purposes, a videotape is being produced using both static and dynamic images from research institutions, software and hardware companies, private individuals, and historical photographs and drawings. The extensive number of CST resources indicates its widespread use. Applications run the gamut from simpler university-simulated problems to those requiring solutions on supercomputers. In some cases, an image or an animation will be mapped onto the actual structure to show the relevance of the computer model to the structure. Due to video production requirements, the image often loses some of its original digital quality and impact when transferred to videotape. Although many CST images are currently available, those that are edited into the final project must meet two important criteria: they must complement the narration, and they must be broadcast quality when recorded on videotape.

  5. Stress-intensity factors for surface cracks in pipes: a computer code for evaluation by use of influence functions. Final report

    International Nuclear Information System (INIS)

    Dedhia, D.D.; Harris, D.O.

    1982-06-01

    A user-oriented computer program for the evaluation of stress intensity factors for cracks in pipes is presented. Stress intensity factors for semi-elliptical, complete circumferential and long longitudinal cracks can be obtained using this computer program. The code is based on the method of influence functions which makes it possible to treat arbitrary stresses on the plane of the crack. The stresses on the crack plane can be entered as a mathematical or tabulated function. A user's manual is included in this report. Background information is also included
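
    The influence-function approach summarized above expresses the mode-I stress intensity factor as an integral of the crack-plane stress weighted by an influence function, K_I = ∫_0^a σ(x) h(x, a) dx, which is what lets arbitrary stresses on the crack plane be handled. The sketch below illustrates that quadrature only; the linear stress profile and the simple weight function are illustrative assumptions, not the tabulated influence functions of the code described in the report.

```python
import numpy as np

def trapezoid(y, x):
    """Plain trapezoidal rule."""
    return 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(x))

def stress_intensity_factor(sigma, m_reg, a, n=400):
    """Approximate K_I = integral_0^a sigma(x) * m_reg(x, a) / sqrt(a - x) dx.

    The influence function is written as m_reg(x, a) / sqrt(a - x), i.e.
    m_reg() is its regular part.  The substitution u = sqrt(a - x) removes
    the integrable square-root singularity at the crack tip, so a plain
    trapezoidal rule converges well.
    """
    u = np.linspace(0.0, np.sqrt(a), n)
    x = a - u**2
    return 2.0 * trapezoid(sigma(x) * m_reg(x, a), u)

# Illustrative inputs (assumptions, not the report's tabulated influence
# functions): a linear crack-plane stress and a simple weight function with
# the characteristic 1/sqrt(a - x) behaviour near the crack tip.
if __name__ == "__main__":
    a = 0.01                                         # crack depth [m]
    sigma = lambda x: 100e6 * (1.0 - 0.5 * x / a)    # stress [Pa]
    m_reg = lambda x, a: 2.0 / np.sqrt(np.pi)        # regular part of h(x, a)
    print(f"K_I ~ {stress_intensity_factor(sigma, m_reg, a):.3e} Pa*sqrt(m)")
```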

  6. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia had issued a regulation to substitute computer terms in foreign languages that had been used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). After sixteen years, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation is conducted to discover the implementation of official computer term usage in scientific publications written in Bahasa Indonesia. The data sources used in this observation are publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting using the Open Archives Initiative - Protocol for Metadata Harvesting (OAI-PMH). Second, the harvested documents (in PDF format) are converted to plain text. The third stage is text preprocessing as the preparation for string matching. The final stage is searching for the official computer terms, based on 629 SPI terms, using the Boyer-Moore algorithm. We observed that there are 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
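
    The final matching stage lends itself to a short illustration. The sketch below counts term occurrences with the Boyer-Moore-Horspool variant (the simplified bad-character-shift member of the Boyer-Moore family); the three SPI term pairs and the sample sentence are placeholders, not the study's 629-entry list or its harvested corpus.

```python
def horspool_count(text, pattern):
    """Count occurrences of `pattern` in `text` using the Boyer-Moore-Horspool
    bad-character shift, a simplified member of the Boyer-Moore family."""
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return 0
    # Shift table: for each character of the pattern (except the last one),
    # how far the search window may safely jump.
    shift = {pattern[i]: m - 1 - i for i in range(m - 1)}
    count, i = 0, 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            count += 1
        i += shift.get(text[i + m - 1], m)
    return count

# Placeholder SPI pairs (foreign term -> Bahasa Indonesia term); the real list
# in the study contains 629 entries.
spi_terms = {"mouse": "tetikus", "keyboard": "papan ketik", "download": "unduh"}
document = "setiap mahasiswa menggunakan mouse dan keyboard untuk download data"

for foreign, official in spi_terms.items():
    hits = horspool_count(document.lower(), foreign)
    if hits:
        print(f"foreign term '{foreign}' found {hits}x (official: '{official}')")
```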

  7. A computational procedure for the dynamics of flexible beams within multibody systems. Ph.D. Thesis Final Technical Report

    Science.gov (United States)

    Downer, Janice Diane

    1990-01-01

    The dynamic analysis of three dimensional elastic beams which experience large rotational and large deformational motions are examined. The beam motion is modeled using an inertial reference for the translational displacements and a body-fixed reference for the rotational quantities. Finite strain rod theories are then defined in conjunction with the beam kinematic description which accounts for the effects of stretching, bending, torsion, and transverse shear deformations. A convected coordinate representation of the Cauchy stress tensor and a conjugate strain definition is introduced to model the beam deformation. To treat the beam dynamics, a two-stage modification of the central difference algorithm is presented to integrate the translational coordinates and the angular velocity vector. The angular orientation is then obtained from the application of an implicit integration algorithm to the Euler parameter/angular velocity kinematical relation. The combined developments of the objective internal force computation with the dynamic solution procedures result in the computational preservation of total energy for undamped systems. The present methodology is also extended to model the dynamics of deployment/retrieval of the flexible members. A moving spatial grid corresponding to the configuration of a deployed rigid beam is employed as a reference for the dynamic variables. A transient integration scheme which accurately accounts for the deforming spatial grid is derived from a space-time finite element discretization of a Hamiltonian variational statement. The computational results of this general deforming finite element beam formulation are compared to reported results for a planar inverse-spaghetti problem.

  8. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  9. Computer simulation of chemical nucleation

    International Nuclear Information System (INIS)

    Turner, J.S.

    1979-01-01

    The problem of nucleation at chemical instabilities is investigated by means of microscopic computer simulation. The first-order transition of interest involves a new kind of nucleation arising from chemical transformations rather than physical forces. Here it is the chemical state of matter, and not matter itself, which is spatially localized to form the nucleus for transition between different chemical states. First, the concepts of chemical instability, nonequilibrium phase transition, and dissipative structure are reviewed briefly. Then recently developed methods of reactive molecular dynamics are used to study chemical nucleation in a simple model chemical reaction. Finally, the connection of these studies to nucleation and condensation processes involving physical and chemical interactions is explored. (orig.)

  10. An agent-based computational model for tuberculosis spreading on age-structured populations

    Science.gov (United States)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease on age-structured populations. The model proposed is a merge of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states, as seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows the prevalence of TB mostly in elders, for high-efficacy treatments.

  11. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  12. Brain Computer Interfaces, a Review

    Directory of Open Access Journals (Sweden)

    Luis Fernando Nicolas-Alonso

    2012-01-01

    Full Text Available A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communications capabilities to severely disabled people who are totally paralyzed or ‘locked in’ by neurological neuromuscular disorders, such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. Here, we review the state-of-the-art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different functional brain activity such as electrical, magnetic or metabolic activity. Second, the review discusses different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with the artifacts in the control signals and improve the performance. Fourth, the review studies some mathematic algorithms used in the feature extraction and classification steps which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices.

  13. Brain Computer Interfaces, a Review

    Science.gov (United States)

    Nicolas-Alonso, Luis Fernando; Gomez-Gil, Jaime

    2012-01-01

    A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communications capabilities to severely disabled people who are totally paralyzed or ‘locked in’ by neurological neuromuscular disorders, such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. Here, we review the state-of-the-art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different functional brain activity such as electrical, magnetic or metabolic activity. Second, the review discusses different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with the artifacts in the control signals and improve the performance. Fourth, the review studies some mathematic algorithms used in the feature extraction and classification steps which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices. PMID:22438708

  14. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    Science.gov (United States)

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  15. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.

  16. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    International Nuclear Information System (INIS)

    1997-01-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface

  17. Some research advances in computer graphics that will enhance applications to engineering design

    Science.gov (United States)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design was described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done were identified.

  18. Single instruction computer architecture and its application in image processing

    Science.gov (United States)

    Laplante, Phillip A.

    1992-03-01

    A single processing computer system using only half-adder circuits is described. In addition, it is shown that only a single hard-wired instruction is needed in the control unit to obtain a complete instruction set for this general purpose computer. Such a system has several advantages. First, it is intrinsically a RISC machine--in fact the 'ultimate RISC' machine. Second, because only a single type of logic element is employed the entire computer system can be easily realized on a single, highly integrated chip. Finally, due to the homogeneous nature of the computer's logic elements, the computer has possible implementations as an optical or chemical machine. This in turn suggests possible paradigms for neural computing and artificial intelligence. After showing how we can implement a full-adder, min, max and other operations using the half-adder, we use an array of such full-adders to implement the dilation operation for two black and white images. Next we implement the erosion operation of two black and white images using a relative complement function and the properties of erosion and dilation. This approach was inspired by papers by van der Poel in which a single instruction is used to furnish a complete set of general purpose instructions and by Bohm-Jacopini where it is shown that any problem can be solved using a Turing machine with one entry and one exit.
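
    The half-adder construction in the abstract is easy to make concrete: a half adder yields sum = a XOR b and carry = a AND b, and two half adders plus an OR of their carries form a full adder, which can then be chained into a ripple-carry adder. The sketch below is only an illustration of that logic, not the paper's single-instruction hardware design.

```python
def half_adder(a, b):
    """Half adder on single bits: returns (sum, carry)."""
    return a ^ b, a & b

def full_adder(a, b, cin):
    """Full adder built from two half adders plus an OR of the two carries."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2

def ripple_add(x, y, width=8):
    """Add two unsigned integers bit by bit with a ripple-carry chain."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

assert ripple_add(23, 42) == 65
print(ripple_add(23, 42))
```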

  19. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  20. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  1. [Computed tomography with computer-assisted detection of pulmonary nodules in dogs and cats].

    Science.gov (United States)

    Niesterok, C; Piesnack, S; Köhler, C; Ludewig, E; Alef, M; Kiefer, I

    2015-01-01

    The aim of this study was to assess the potential benefit of computer-assisted detection (CAD) of pulmonary nodules in veterinary medicine. Therefore, the CAD rate was compared to the detection rates of two individual examiners in terms of its sensitivity and false-positive findings. We included 51 dogs and 16 cats with pulmonary nodules previously diagnosed by computed tomography. First, the number of nodules ≥ 3 mm was recorded for each patient by two independent examiners. Subsequently, each examiner used the CAD software for automated nodule detection. With the knowledge of the CAD results, a final consensus decision on the number of nodules was achieved. The software used was a commercially available CAD program. The sensitivity of examiner 1 was 89.2%, while that of examiner 2 reached 87.4%. CAD had a sensitivity of 69.4%. With CAD, the sensitivity of examiner 1 increased to 94.7% and that of examiner 2 to 90.8%. The CAD-system, which we used in our study, had a moderate sensitivity of 69.4%. Despite its severe limitations, with a high level of false-positive and false-negative results, CAD increased the examiners' sensitivity. Therefore, its supportive role in diagnostics appears to be evident.
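
    The sensitivities quoted above follow from the usual definition, detected true nodules divided by all nodules present, and the gain from adding CAD comes from taking the union of the reader's and the CAD system's findings. The sketch below uses invented per-nodule detection sets, not the study's data.

```python
def sensitivity(detected, truth):
    """Sensitivity = true positives / all true nodules."""
    return len(detected & truth) / len(truth)

# Made-up per-nodule detection sets (not the study's data): nodules are
# identified by arbitrary IDs, `truth` is the full set present on CT.
truth      = set(range(100))
examiner_1 = set(range(0, 89))               # misses 11 nodules
cad        = set(range(20, 90)) | {95, 96}   # misses others, adds a few finds

print(f"examiner alone : {sensitivity(examiner_1, truth):.1%}")
print(f"CAD alone      : {sensitivity(cad, truth):.1%}")
print(f"examiner + CAD : {sensitivity(examiner_1 | cad, truth):.1%}")
```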

  2. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer-based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  3. Self-correcting quantum computers

    International Nuclear Information System (INIS)

    Bombin, H; Chhajlany, R W; Horodecki, M; Martin-Delgado, M A

    2013-01-01

    Is the notion of a quantum computer (QC) resilient to thermal noise unphysical? We address this question from a constructive perspective and show that local quantum Hamiltonian models provide self-correcting QCs. To this end, we first give a sufficient condition on the connectedness of excitations for a stabilizer code model to be a self-correcting quantum memory. We then study the two main examples of topological stabilizer codes in arbitrary dimensions and establish their self-correcting capabilities. Also, we address the transversality properties of topological color codes, showing that six-dimensional color codes provide a self-correcting model that allows the transversal and local implementation of a universal set of operations in seven spatial dimensions. Finally, we give a procedure for initializing such quantum memories at finite temperature. (paper)

  4. The All-or-Nothing Anti-Theft Policy - Theft Protection for Pervasive Computing

    DEFF Research Database (Denmark)

    Pedersen, Michael Østergaard; Pagter, Jakob Illeborg

    2007-01-01

    In many application scenarios for pervasive computing, theft is a serious security threat. In this paper we present the All-Or-Nothing anti-theft policy aimed at providing theft protection for pervasive computing. The overall idea behind the All-Or-Nothing anti-theft policy is to chain devices together in friendly networks so that any device will only work when it can see all of its friends. Thus a thief will have to keep the network of friendly devices together even if he only desires to steal one of the devices. Otherwise the device will not work. We provide a detailed security policy, present the required cryptographic protocols, provide three different applications, and finally we document that the policy is suitable for implementation on typical pervasive computing devices.

  5. Thai Language Sentence Similarity Computation Based on Syntactic Structure and Semantic Vector

    Science.gov (United States)

    Wang, Hongbin; Feng, Yinhan; Cheng, Liang

    2018-03-01

    Sentence similarity computation plays an increasingly important role in text mining, Web page retrieval, machine translation, speech recognition and question answering systems. Thai is a resource-scarce language; unlike Chinese, it lacks lexical resources such as HowNet and CiLin, so Thai sentence similarity research faces particular challenges. To address this problem, this paper proposes a novel method to compute the similarity of Thai sentences based on syntactic structure and semantic vectors. The method first uses Part-of-Speech (POS) dependencies to calculate the syntactic structure similarity of two sentences, and then uses word vectors to calculate their semantic similarity. Finally, the two measures are combined into an overall similarity score for the two Thai sentences. The proposed method considers not only semantics but also sentence syntactic structure. The experimental results show that this method is feasible for Thai sentence similarity computation.
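
    A minimal sketch of the combination step follows. Here the syntactic score is approximated by the overlap of POS dependency pairs and the semantic score by the cosine of averaged word vectors, merged with a weighted sum; the POS pairs, the two-dimensional toy vectors and the weight alpha are illustrative assumptions rather than the paper's actual resources or parameters.

```python
import math

def syntactic_similarity(pairs_a, pairs_b):
    """Jaccard overlap of (head-POS, dependent-POS) pairs, standing in for
    the paper's POS-dependency structure comparison."""
    a, b = set(pairs_a), set(pairs_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def semantic_similarity(vecs_a, vecs_b):
    """Cosine similarity of averaged word vectors."""
    avg = lambda vs: [sum(col) / len(vs) for col in zip(*vs)]
    u, v = avg(vecs_a), avg(vecs_b)
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return dot / norm if norm else 0.0

def sentence_similarity(pairs_a, pairs_b, vecs_a, vecs_b, alpha=0.4):
    """Weighted combination of syntactic and semantic scores (alpha assumed)."""
    return (alpha * syntactic_similarity(pairs_a, pairs_b)
            + (1 - alpha) * semantic_similarity(vecs_a, vecs_b))

# Toy example with invented POS dependency pairs and 2-D word vectors.
pairs_1 = [("VERB", "NOUN"), ("NOUN", "ADJ")]
pairs_2 = [("VERB", "NOUN"), ("NOUN", "ADV")]
vecs_1  = [[0.9, 0.1], [0.4, 0.6]]
vecs_2  = [[0.8, 0.2], [0.5, 0.5]]
print(f"similarity = {sentence_similarity(pairs_1, pairs_2, vecs_1, vecs_2):.3f}")
```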

  6. Preaching What We Practice: Teaching Ethical Decision-Making to Computer Security Professionals

    Science.gov (United States)

    Fleischmann, Kenneth R.

    The biggest challenge facing computer security researchers and professionals is not learning how to make ethical decisions; rather it is learning how to recognize ethical decisions. All too often, technology development suffers from what Langdon Winner terms technological somnambulism - we sleepwalk through our technology design, following past precedents without a second thought, and fail to consider the perspectives of other stakeholders [1]. Computer security research and practice involves a number of opportunities for ethical decisions. For example, decisions about whether or not to automatically provide security updates involve tradeoffs related to caring versus user autonomy. Decisions about online voting include tradeoffs between convenience and security. Finally, decisions about routinely screening e-mails for spam involve tradeoffs of efficiency and privacy. It is critical that these and other decisions facing computer security researchers and professionals are confronted head on as value-laden design decisions, and that computer security researchers and professionals consider the perspectives of various stakeholders in making these decisions.

  7. On Writing and Reading Artistic Computational Ecosystems.

    Science.gov (United States)

    Antunes, Rui Filipe; Leymarie, Frederic Fol; Latham, William

    2015-01-01

    We study the use of the generative systems known as computational ecosystems to convey artistic and narrative aims. These are virtual worlds running on computers, composed of agents that trade units of energy and emulate cycles of life and behaviors adapted from biological life forms. In this article we propose a conceptual framework in order to understand these systems, which are involved in processes of authorship and interpretation that this investigation analyzes in order to identify critical instruments for artistic exploration. We formulate a model of narrative that we call system stories (after Mitchell Whitelaw), characterized by the dynamic network of material and conceptual processes that define these artefacts. They account for narrative constellations with multiple agencies from which meaning and messages emerge. Finally, we present three case studies to explore the potential of this model within an artistic and generative domain, arguing that this understanding expands and enriches the palette of the language of these systems.

  8. Computational mechanics of nonlinear response of shells

    Energy Technology Data Exchange (ETDEWEB)

    Kraetzig, W.B. (Bochum Univ. (Germany, F.R.). Inst. fuer Statik und Dynamik); Onate, E. (Universidad Politecnica de Cataluna, Barcelona (Spain). Escuela Tecnica Superior de Ingenieros de Caminos) (eds.)

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields reaching from space and aircraft structures, pipes and pressure vessels over liquid storage tanks, off-shore installations, cooling towers and domes, to bodyworks of motor vehicles. Of continuously increasing importance is their nonlinear behavior, in which large deformations and large rotations are involved as well as nonlinear material properties. The book starts with a survey about nonlinear shell theories from the rigorous point of view of continuum mechanics, this starting point being unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable shell responses, which draw computational connections to well established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute to specific applications opening a small window of the above mentioned wide spectrum. (orig./HP) With 159 figs.

  9. Computational mechanics of nonlinear response of shells

    International Nuclear Information System (INIS)

    Kraetzig, W.B.; Onate, E.

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields reaching from space and aircraft structures, pipes and pressure vessels over liquid storage tanks, off-shore installations, cooling towers and domes, to bodyworks of motor vehicles. Of continuously increasing importance is their nonlinear behavior, in which large deformations and large rotations are involved as well as nonlinear material properties. The book starts with a survey about nonlinear shell theories from the rigorous point of view of continuum mechanics, this starting point being unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable shell responses, which draw computational connections to well established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute to specific applications opening a small window of the above mentioned wide spectrum. (orig./HP) With 159 figs

  10. Fluid-Thermal-Structural Coupled Analysis of a Radial Inflow Micro Gas Turbine Using Computational Fluid Dynamics and Computational Solid Mechanics

    Directory of Open Access Journals (Sweden)

    Yonghui Xie

    2014-01-01

    Full Text Available A three-dimensional fluid-thermal-structural coupled analysis for a radial inflow micro gas turbine is conducted. First, a fluid-thermal coupled analysis of the flow and temperature fields of the nozzle passage and the blade passage is performed by using computational fluid dynamics (CFD). The flow and heat transfer characteristics of different sections are analyzed in detail. The thermal load and the aerodynamic load are then obtained from the temperature field and the pressure distribution. The stress distributions of the blade are finally studied by using computational solid mechanics (CSM) considering three cases of loads: thermal load, aerodynamic load combined with centrifugal load, and all three types of loads together. The detailed parameters of the flow, temperature, and stress are obtained and analyzed. The numerical results obtained provide a useful knowledge base for further exploration of radial gas turbine design.

  11. PC as physics computer for LHC?

    International Nuclear Information System (INIS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong

    1996-01-01

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March this year in the Physics Data Processing group of CERN's CN division, is described where ordinary desktop PCs running Windows (NT and 3.11) have been used for creating an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results when comparing to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments. (author)

  12. Computation within the auxiliary field approach

    International Nuclear Information System (INIS)

    Baeurle, S.A.

    2003-01-01

    Recently, the classical auxiliary field methodology has been developed as a new simulation technique for performing calculations within the framework of classical statistical mechanics. Since the approach suffers from a sign problem, a judicious choice of the sampling algorithm, allowing fast statistical convergence and efficient generation of field configurations, is of fundamental importance for a successful simulation. In this paper we focus on the computational aspects of this simulation methodology. We introduce two different types of algorithms, the single-move auxiliary field Metropolis Monte Carlo algorithm and two new classes of force-based algorithms, which enable multiple-move propagation. In addition, to further optimize the sampling, we describe a preconditioning scheme which permits each field degree of freedom to be treated individually with regard to its evolution through the auxiliary field configuration space. Finally, we demonstrate the validity and assess the competitiveness of these algorithms on a representative practical example. We believe that they may also provide an interesting possibility for enhancing the computational efficiency of other auxiliary field methodologies
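
    The single-move algorithm mentioned above can be sketched generically: one field component is proposed at a time and accepted with probability min(1, exp(-ΔS)). The toy quadratic, real-valued action below is an assumption standing in for the auxiliary-field action (and sidesteps its sign problem), so the sketch shows only the sampling pattern, not the paper's methodology.

```python
import math
import random

def single_move_metropolis(action, field, n_sweeps=1000, step=0.5):
    """Single-move Metropolis sampling of exp(-S[field]).

    One field component is proposed at a time and the move is accepted
    with probability min(1, exp(-(S_new - S_old))).
    """
    s_old = action(field)
    for _ in range(n_sweeps):
        for i in range(len(field)):
            old = field[i]
            field[i] = old + random.uniform(-step, step)
            s_new = action(field)
            if random.random() < math.exp(min(0.0, s_old - s_new)):
                s_old = s_new            # accept the proposed move
            else:
                field[i] = old           # reject and restore the old value
    return field

# Toy quadratic action S = 0.5 * sum(phi_i^2) (an assumption); the sampled
# components should have roughly unit variance.
random.seed(0)
phi = single_move_metropolis(lambda f: 0.5 * sum(x * x for x in f), [0.0] * 50)
print(sum(x * x for x in phi) / len(phi))
```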

  13. Pc as Physics Computer for Lhc ?

    Science.gov (United States)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March this year in the Physics Data Processing group, of CERN's CN division is described where ordinary desktop PCs running Windows (NT and 3.11) have been used for creating an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described together with some encouraging benchmark results when comparing to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (Batch monitor, staging software, etc.) are also covered. Finally a quick extrapolation of commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  14. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

    Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium energy monochromatic X-rays (50 - 100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient, to consider iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author) [fr

  15. Computer-aided Nonlinear Control System Design Using Describing Function Models

    CERN Document Server

    Nassirharand, Amir

    2012-01-01

    A systematic computer-aided approach provides a versatile setting for the control engineer to overcome the complications of controller design for highly nonlinear systems. Computer-aided Nonlinear Control System Design provides such an approach based on the use of describing functions. The text deals with a large class of nonlinear systems without restrictions on the system order, the number of inputs and/or outputs or the number, type or arrangement of nonlinear terms. The strongly software-oriented methods detailed facilitate fulfillment of tight performance requirements and help the designer to think in purely nonlinear terms, avoiding the expedient of linearization which can impose substantial and unrealistic model limitations and drive up the cost of the final product. Design procedures are presented in a step-by-step algorithmic format each step being a functional unit with outputs that drive the other steps. This procedure may be easily implemented on a digital computer with example problems from mecha...

  16. Analysis of stationary availability factor of two-level backbone computer networks with arbitrary topology

    Science.gov (United States)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method, offered by the author for calculating the stationary availability factor of two-level backbone computer networks, is discussed; it is based on Markov reliability models for a set of independent repairable elements with given failure and repair rates, together with methods of discrete mathematics. A specialized algorithm, offered by the author for analyzing network connectivity while taking different kinds of network equipment failures into account, is also described. Finally, this paper presents an example of calculating the stationary availability factor for a backbone computer network with a given topology.
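
    For a single repairable element with failure rate λ and repair rate μ, the two-state Markov model gives a stationary availability A = μ/(λ + μ); availabilities of independent elements multiply in series, and redundant groups combine through their unavailabilities. The sketch below covers only these standard building blocks and a hypothetical two-path example; it does not reproduce the paper's connectivity analysis of an arbitrary backbone topology.

```python
def element_availability(failure_rate, repair_rate):
    """Stationary availability of one repairable element, A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def series(availabilities):
    """All elements are needed: availabilities multiply."""
    a = 1.0
    for x in availabilities:
        a *= x
    return a

def parallel(availabilities):
    """Any one element suffices: combine the unavailabilities."""
    u = 1.0
    for x in availabilities:
        u *= 1.0 - x
    return 1.0 - u

# Hypothetical example: two redundant backbone paths, each consisting of a
# link in series with its own switch (rates per hour).
link   = element_availability(failure_rate=1e-4, repair_rate=1e-1)
switch = element_availability(failure_rate=5e-5, repair_rate=2e-1)
path   = series([link, switch])
print(f"single path : {path:.6f}")
print(f"two paths   : {parallel([path, path]):.6f}")
```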

  17. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation.

  18. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  19. A subtraction scheme for computing QCD jet cross sections at NNLO: integrating the doubly unresolved subtraction terms

    CERN Document Server

    Somogyi, Gabor

    2013-01-01

    We finish the definition of a subtraction scheme for computing NNLO corrections to QCD jet cross sections. In particular, we perform the integration of the soft-type contributions to the doubly unresolved counterterms via the method of Mellin-Barnes representations. With these final ingredients in place, the definition of the scheme is complete and the computation of the regularised doubly virtual contribution to the NNLO cross section becomes feasible.

  20. The LiveWire Project final report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, C.D.; Nelson, T.T. [Enova Technology, San Diego, CA (United States); Kelly, J.C.; Dominguez, H.A. [Paragon Consulting Services, La Verne, CA (United States)

    1997-10-01

    Utilities across the US have begun pilot testing a variety of hardware and software products to develop a two-way communications system between themselves and their customers. Their purpose is to reduce utility operating costs and to provide new and improved services for customers in light of pending changes in the electric industry being brought about by deregulation. A consortium including utilities, national labs, consultants, and contractors, with the support of the Department of Energy (DOE) and the Electric Power Research Institute (EPRI), initiated a project that utilized a hybrid fiber-coax (HFC) wide-area network integrated with a CEBus-based local area network within the customer's home. The system combined energy consumption data taken within the home and home automation features to provide a suite of energy management services for residential customers. The information was transferred via the Internet through the HFC network and presented to the customer on their personal computer. This final project report discusses the design, prototype testing, and system deployment planning of the energy management system.

  1. Use on non-conjugate prior distributions in compound failure models. Final technical report

    International Nuclear Information System (INIS)

    Shultis, J.K.; Johnson, D.E.; Milliken, G.A.; Eckhoff, N.D.

    1981-12-01

    Several theoretical and computational techniques are presented for compound failure models in which the failure rate or failure probability for a class of components is considered to be a random variable. Both the failure-on-demand and failure-rate situations are considered. Ten different prior families are presented for describing the variation or uncertainty of the failure parameter. Methods considered for estimating values of the prior parameters from a given set of failure data are (1) matching data moments to those of the prior distribution, (2) matching data moments to those of the compound marginal distribution, and (3) the marginal maximum likelihood method. Numerical methods for computing the parameter estimators for all ten prior families are presented, as well as methods for obtaining estimates of the variances and covariances of the parameter estimators; it is shown that various confidence, probability, and tolerance intervals can be evaluated. Finally, to test the resulting failure models against the given failure data, generalized chi-square and Kolmogorov-Smirnov goodness-of-fit tests are proposed together with a test to eliminate outliers from the failure data. Computer codes based on the results presented here have been prepared and are presented in a companion report
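
    For the failure-on-demand case with a beta prior, the first estimation method listed (matching data moments to the prior) has a closed form: with sample mean m and variance v of the observed failure probabilities, α = m(m(1 - m)/v - 1) and β = (1 - m)(m(1 - m)/v - 1). The sketch below implements only that one method; the data are hypothetical and the report's other prior families and estimators are not covered.

```python
def beta_prior_by_moment_matching(failure_probs):
    """Match the sample mean and variance of observed failure-on-demand
    probabilities to a beta(alpha, beta) prior."""
    n = len(failure_probs)
    m = sum(failure_probs) / n
    v = sum((p - m) ** 2 for p in failure_probs) / (n - 1)
    if v <= 0 or v >= m * (1 - m):
        raise ValueError("sample moments incompatible with a beta prior")
    k = m * (1 - m) / v - 1.0      # k = alpha + beta
    return m * k, (1.0 - m) * k

# Hypothetical failure-on-demand estimates for a class of components.
data = [0.010, 0.014, 0.008, 0.020, 0.012, 0.016]
alpha, beta = beta_prior_by_moment_matching(data)
print(f"alpha = {alpha:.2f}, beta = {beta:.2f}")
```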

  2. Study of an analog/logic processor for the design of an auto patch hybrid computer

    International Nuclear Information System (INIS)

    Koched, Hassen

    1976-01-01

    This paper presents an experimental study of an analog multiprocessor designed at SES/CEN-Saclay, and an application of such a device as a basic component of an auto-patch hybrid computer. First, a description of the processor and a presentation of the theoretical concepts that governed its design are given. Experiments on a hybrid computer are then presented. Finally, different automatic patching systems are presented and suitably modified for use with such a processor. (author) [fr

  3. Final Stage Development of Reactor Console Simulator

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Nurfarhana Ayuni Joha

    2013-01-01

    The PUSPATI TRIGA Reactor console simulator has been under development since the end of 2011 and is now in its final stage of development. It will be an interactive tool for operator training and teaching at the PUSPATI TRIGA Reactor. The behavior and characteristics of the reactor console and of the reactor itself can be evaluated and understood. The simulator will be used as a complement to the present reactor console. The human-system interface (HSI) is implemented using computer screens, a keyboard and a mouse. Multiple screens are used to match the physical layout of the present reactor console. LabVIEW software is used for the user interface and the mathematical calculations. Polynomial equations based on control rod calibration data, as well as recorded operation parameters, are used to calculate and estimate reactor console parameters. The user interface, reactor physics and thermal-hydraulics capabilities can be expanded and explored for simulation and modeling of a new reactor console, research reactors and nuclear power plants. (author)

  4. Dosimetry in abdominal imaging by 6-slice computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Sonia Isabel [Hospital de Faro, EPE (Portugal); Abrantes, Antonio Fernando; Ribeiro, Luis Pedro; Almeida, Rui Pedro Pereira [University of Algarve (Portugal). School of Health. Dept. of Radiology

    2012-11-15

    Objective: To determine the effective dose in abdominal computed tomography imaging and to study the influence of patients' characteristics on the received dose. Materials and Methods: Dose values measurements were performed with an ionization chamber on phantoms to check the agreement between dose values and those presented by the computed tomography apparatus, besides their compliance with the recommended reference dose levels. Later, values of dose received by physically able patients submitted to abdominal computed tomography (n = 100) were measured and correlated with their anthropometric characteristics. Finally, the dose to organs was simulated with the Monte Carlo method using the CT-Expo V 1.5 software, and the effect of automatic exposure control on such examinations. Results: The main characteristics directly influencing the dose include the patients' body mass, abdominal perimeter and body mass index, whose correlation is linear and positive. Conclusion: The radiation dose received from abdominal CT scans depends on some patient's characteristics, and it is important to adjust the acquisition parameters to their dimensions (author)

  5. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas-based detector R&D relies heavily on the full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This can result in inefficient resource utilization and an increase in the turnaround time of the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.
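    The structure of such a steering loop can be illustrated with a self-contained toy (assumed names, a toy "physics" step, and no real VisIt coupling): intermediate results are published while the run is in progress, and parameter changes pushed into a command queue are applied between steps instead of restarting the job.

      # Toy illustration of the steering idea; it does not use VisIt's in-situ API.
      import queue, threading, time

      commands = queue.Queue()

      def steering_client():
          # Stands in for a viewer/analyst issuing commands during the run.
          time.sleep(0.2); commands.put(("set_gain", 2.0))
          time.sleep(0.2); commands.put(("stop", None))

      threading.Thread(target=steering_client, daemon=True).start()

      gain, signal, running, step = 1.0, 0.0, True, 0
      while running and step < 1000:
          signal += 0.01 * gain                # toy "physics" step
          if step % 10 == 0:
              print(f"step {step:4d}  gain {gain:.1f}  signal {signal:.2f}")  # live data
          while not commands.empty():          # apply steering commands between steps
              name, value = commands.get()
              if name == "set_gain":
                  gain = value
              elif name == "stop":
                  running = False
          time.sleep(0.01)
          step += 1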

  6. Polytopol computing for multi-core and distributed systems

    Science.gov (United States)

    Spaanenburg, Henk; Spaanenburg, Lambert; Ranefors, Johan

    2009-05-01

    Multi-core computing poses new challenges to software engineering. The paper addresses such issues in the general setting of polytopol computing, which takes into account multi-core problems in such widely differing areas as ambient-intelligence sensor networks and cloud computing. It argues that the essence lies in a suitable allocation of freely moving tasks. Where hardware is ubiquitous and pervasive, the network is virtualized into a collection of software snippets judiciously injected into that hardware so that a system function appears as a single whole again. The concept of polytopol computing provides a further formalization in terms of the partitioning of labor between collector and sensor nodes. Collectors provide functions such as knowledge integration, awareness collection, situation display/reporting, communication of clues and an inquiry interface. Sensors provide functions such as anomaly detection (communicating only singularities, not continuous observations); they are generally powered or self-powered, amorphous (not on a grid) with generation and attrition, field-reprogrammable, and plug-and-play-able. Together the collector and the sensor are part of the skeleton-injector mechanism, added to every node, which gives the network the ability to organize itself into any of many topologies. Finally, we discuss a number of applications and indicate how a multi-core architecture supports the security aspects of the skeleton injector.

  7. 78 FR 75942 - Certain Mobile Phones and Tablet Computers, and Components Thereof; Commission Determination To...

    Science.gov (United States)

    2013-12-13

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Mobile Phones and Tablet Computers, and Components Thereof; Commission Determination To Review in Part a Final Initial Determination... Qualcomm Magellan and Odyssey transceiver chips have become a de facto standard in the mobile devices...

  8. Broadband computation of the scattering coefficients of infinite arbitrary cylinders.

    Science.gov (United States)

    Blanchard, Cédric; Guizal, Brahim; Felbacq, Didier

    2012-07-01

    We employ a time-domain method to compute the near field on a contour enclosing infinitely long cylinders of arbitrary cross section and constitution. We therefore recover the cylindrical Hankel coefficients of the expansion of the field outside the circumscribed circle of the structure. The recovered coefficients enable the wideband analysis of complex systems, e.g., the determination of the radar cross section becomes straightforward. The prescription for constructing such a numerical tool is provided in great detail. The method is validated by computing the scattering coefficients for a homogeneous circular cylinder illuminated by a plane wave, a problem for which an analytical solution exists. Finally, some radiation properties of an optical antenna are examined by employing the proposed technique.
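    The analytical reference used for validation can be sketched as follows, following the standard normal-incidence TM-polarization (E parallel to the axis) formula as given, e.g., by Bohren and Huffman; the normalization and sign convention are assumptions and may differ from the paper's.

      # Analytical scattering coefficients of a homogeneous circular cylinder at
      # normal incidence, TM polarization (convention assumed, not taken from the paper).
      import numpy as np
      from scipy.special import jv, jvp, hankel1, h1vp

      def cylinder_tm_coefficients(m, x, n_max):
          """b_n for a cylinder of relative index m and size parameter x = k*a."""
          n = np.arange(-n_max, n_max + 1)
          num = jv(n, m * x) * jvp(n, x) - m * jvp(n, m * x) * jv(n, x)
          den = jv(n, m * x) * h1vp(n, x) - m * jvp(n, m * x) * hankel1(n, x)
          return n, num / den

      orders, b = cylinder_tm_coefficients(m=1.5, x=2.0, n_max=8)
      for nn, bn in zip(orders, b):
          if nn >= 0:
              print(f"n = {nn}:  b_n = {bn:.4f}")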

  9. Projection matrix acquisition for cone-beam computed tomography iterative reconstruction

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Shi, Wenlong; Zhang, Caixin; Gao, Zongzhao

    2017-02-01

    Projection matrix computation is an essential and time-consuming part of computed tomography (CT) iterative reconstruction. In this article a novel calculation algorithm for the three-dimensional (3D) projection matrix is proposed to quickly acquire the matrix for cone-beam CT (CBCT). The volume to be reconstructed is considered as consisting of three orthogonal sets of equally spaced, parallel planes rather than individual voxels. After obtaining the intersections of the rays with the voxel surfaces, the coordinates of the intersection points are compared with the voxel vertices to obtain the indices of the voxels traversed by the ray. Without considering the slope of the ray relative to each voxel, only the positions of two points need to be compared. Finally, computer simulation is used to verify the effectiveness of the algorithm.
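    The plane-intersection idea can be sketched with a classical Siddon-type traversal (grid layout, names and test ray are assumptions, not the authors' code): the sorted intersection parameters along the ray yield the indices and intersection lengths of the traversed voxels.

      # Classical plane-intersection (Siddon-type) ray traversal, for illustration.
      import numpy as np

      def traversed_voxels(p0, p1, origin, spacing, shape):
          p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
          origin, spacing = np.asarray(origin, float), np.asarray(spacing, float)
          d = p1 - p0
          alphas = [0.0, 1.0]
          for ax in range(3):                          # intersections with each plane set
              if abs(d[ax]) > 1e-12:
                  planes = origin[ax] + spacing[ax] * np.arange(shape[ax] + 1)
                  alphas.extend((planes - p0[ax]) / d[ax])
          alphas = np.unique(np.clip(alphas, 0.0, 1.0))
          out = []
          for a0, a1 in zip(alphas[:-1], alphas[1:]):
              if a1 <= a0:
                  continue
              mid = p0 + 0.5 * (a0 + a1) * d           # midpoint identifies the voxel
              idx = np.floor((mid - origin) / spacing).astype(int)
              if np.all(idx >= 0) and np.all(idx < shape):
                  out.append((tuple(idx), (a1 - a0) * np.linalg.norm(d)))  # (voxel, length)
          return out

      print(traversed_voxels([-1, 0.5, 0.5], [3, 1.5, 0.5],
                             origin=[0, 0, 0], spacing=[1, 1, 1], shape=(2, 2, 2)))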

  10. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  11. Adiabatic approximation with exponential accuracy for many-body systems and quantum computation

    International Nuclear Information System (INIS)

    Lidar, Daniel A.; Rezakhani, Ali T.; Hamma, Alioscia

    2009-01-01

    We derive a version of the adiabatic theorem that is especially suited for applications in adiabatic quantum computation, where it is reasonable to assume that the adiabatic interpolation between the initial and final Hamiltonians is controllable. Assuming that the Hamiltonian is analytic in a finite strip around the real-time axis, that some number of its time derivatives vanish at the initial and final times, and that the target adiabatic eigenstate is nondegenerate and separated by a gap from the rest of the spectrum, we show that one can obtain an error between the final adiabatic eigenstate and the actual time-evolved state which is exponentially small in the evolution time, where this time itself scales as the square of the norm of the time derivative of the Hamiltonian divided by the cube of the minimal gap.
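    Schematically, with constants and technical conditions suppressed (this merely restates the abstract, not the paper's precise theorem):

      \[
        \bigl\| \, |\psi(T)\rangle - |\phi_{\mathrm{ad}}(T)\rangle \, \bigr\| = O\!\left(e^{-cT}\right),
        \qquad
        T \;\sim\; \frac{\max_{t}\,\|\partial_t H(t)\|^{2}}{\Delta_{\min}^{3}} .
      \]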

  12. Computer technology applications in industrial and organizational psychology.

    Science.gov (United States)

    Crespin, Timothy R; Austin, James T

    2002-08-01

    This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research: computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.

  13. Computing Programs for Determining Traffic Flows from Roundabouts

    Science.gov (United States)

    Boroiu, A. A.; Tabacu, I.; Ene, A.; Neagu, E.; Boroiu, A.

    2017-10-01

    For modelling road traffic at the level of a road network, it is necessary to specify the flows of all traffic currents at each intersection. These data can be obtained by direct measurements at traffic-light intersections, but in the case of a roundabout this is not possible directly, and neither the literature nor traffic modelling software offers a way to solve this issue. Two sets of formulas are proposed by which all traffic flows in roundabouts with 3 or 4 arms are calculated from the streams that can be measured. The objective of this paper is to develop computational programs that operate with these formulas. For each of the two sets of analytical relations, a computational program was developed in the Java programming language. The obtained results fully confirm the applicability of the calculation programs. The final stage in capitalizing on these programs will be to publish them as web pages in HTML format, so that they can be accessed and used on the Internet. The achievements presented in this paper are an important step toward providing a necessary tool for traffic modelling, because these computational programs can be easily integrated into specialized software.
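    For a 3-arm roundabout the kind of relations involved can be sketched from generic flow-conservation arguments (arms numbered in the direction of circulation, no U-turns); these relations are assumptions for illustration, not the formulas published in the paper, and the sketch is in Python rather than the authors' Java.

      # Illustrative only: turning flows q[i][j] of a 3-arm roundabout derived from
      # measurable entry flows E_i and circulating flows C_i (flow on the ring
      # passing in front of entry i), assuming circulation 1 -> 2 -> 3 -> 1.
      def roundabout_3arm_flows(E, C):
          q = {}
          q[(3, 2)] = C[1]            # only stream passing in front of entry 1
          q[(1, 3)] = C[2]
          q[(2, 1)] = C[3]
          q[(1, 2)] = E[1] - C[2]     # remainder of entry 1 leaves at the next exit
          q[(2, 3)] = E[2] - C[3]
          q[(3, 1)] = E[3] - C[1]
          return q

      flows = roundabout_3arm_flows(E={1: 400, 2: 350, 3: 300},   # veh/h, assumed counts
                                    C={1: 120, 2: 150, 3: 100})
      for (i, j), v in sorted(flows.items()):
          print(f"q{i}{j} = {v} veh/h")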

  14. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Raboin, P. J. LLNL

    1998-01-01

    Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in Uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models with the spin-forming process and with computational simulations.

  15. Mesh influence on the fire computer modeling in nuclear power plants

    Directory of Open Access Journals (Sweden)

    D. Lázaro

    2018-04-01

    Fire computer models allow the consequences of real fire scenarios to be studied. Their use in nuclear power plants has increased with the new regulations that apply risk-informed, performance-based methods to the analysis and design of fire safety solutions. The selection of the cell size is very important in these kinds of models: the mesh must strike a compromise between fitting the geometry, resolving the equations and keeping computation times acceptable. This paper studies the impact of several cell sizes, using the fire computer model FDS, to evaluate their relative effect on the final simulation results. To that end, we have employed several scenarios of interest for nuclear power plants. The conclusions offer relevant data for users and indicate cell sizes that can be selected to guarantee the quality of the simulations and reduce the uncertainty of the results.
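    A rule of thumb commonly used alongside such studies (not taken from this paper) is to size cells from the characteristic fire diameter D*, keeping D*/dx roughly between 4 and 16, as usually quoted from the FDS User's Guide:

      # Common rule of thumb for choosing the FDS cell size (not from this paper):
      # compute the characteristic fire diameter D* and keep D*/dx between ~4 and ~16.
      def characteristic_fire_diameter(q_kw, rho=1.204, cp=1.005, t_inf=293.0, g=9.81):
          """D* in metres for a heat release rate q_kw in kW (ambient air assumed)."""
          return (q_kw / (rho * cp * t_inf * g ** 0.5)) ** (2.0 / 5.0)

      q = 1000.0                                  # kW, an assumed design fire
      d_star = characteristic_fire_diameter(q)
      print(f"D* = {d_star:.2f} m")
      print(f"coarse grid (D*/dx = 4):  dx ~ {d_star / 4:.3f} m")
      print(f"fine grid   (D*/dx = 16): dx ~ {d_star / 16:.3f} m")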

  16. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  17. Radiation Tolerance Qualification Tests of the Final Source Interface Unit for the ALICE Experiment

    CERN Document Server

    Dénes, E; Futó, E; Kerék, A; Kiss, T; Molnár, J; Novák, D; Soós, C; Tölyhi, T; Van de Vyvre, P

    2007-01-01

    The ALICE Detector Data Link (DDL) is a high-speed optical link designed to interface the readout electronics of ALICE sub-detectors to the DAQ computers. The Source Interface Unit (SIU) of the DDL will operate in radiation environment. Previous tests showed that a configuration loss of SRAM-based FPGA devices may happen and the frequency of undetected data errors in the FPGA user memory area is also not acceptable. Therefore, we redesigned the SIU card using another FPGA based on flash technology. In order to detect bit errors in the user memory we added parity check logic to the design. The new SIU has been extensively tested using neutron and proton irradiation to verify its radiation tolerance. In this paper we summarize the design changes, introduce the final design, and the results of the radiation tolerance measurements on the final card.

  18. COMPUTER SIMULATION THE MECHANICAL MOVEMENT BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    Here we consider the technique of using the computer mathematics system MathCAD for the computer implementation of a mathematical model of the mechanical motion of a physical body thrown at an angle to the horizon, and its use for educational computer simulation experiments in teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models at the second stage of higher education are noted. The creation of a computer simulation model is described that allows the mechanical movement of the body to be analyzed comprehensively by changing the input parameters of the model: the acceleration of gravity, the initial and final positions of the body, the initial velocity and angle, and the geometric dimensions of the body and target. The technique is aimed at the effective assimilation of students' basic knowledge and skills in mathematical modeling; it provides an opportunity to better master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of students' logical thinking and their motivation to learn the discipline, improves cognitive interest, and forms research skills, thus creating conditions for the effective formation of the professional competence of future specialists.
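    A minimal stand-in for such a model, written in Python rather than the authors' MathCAD worksheet and with assumed parameter values, is:

      # Projectile motion of a body thrown at an angle to the horizon, with
      # adjustable input parameters (illustrative, not the authors' worksheet).
      import math

      def trajectory(v0, angle_deg, y0=0.0, g=9.81, dt=0.01):
          """Return a list of (x, y) points until the body returns to ground level."""
          theta = math.radians(angle_deg)
          vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
          x, y, points = 0.0, y0, []
          while y >= 0.0:
              points.append((x, y))
              x += vx * dt
              vy -= g * dt
              y += vy * dt
          return points

      pts = trajectory(v0=20.0, angle_deg=45.0)
      x_range = pts[-1][0]
      y_max = max(p[1] for p in pts)
      print(f"range ~ {x_range:.1f} m, maximum height ~ {y_max:.1f} m")
      # Analytic check: range = v0^2*sin(2*theta)/g ~ 40.8 m, height = (v0*sin(theta))^2/(2g) ~ 10.2 m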

  19. Cryptographically Secure Multiparty Computation and Distributed Auctions Using Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Anunay Kulshrestha

    2017-12-01

    We introduce a robust framework that allows for cryptographically secure multiparty computations, such as distributed private-value auctions. The security is guaranteed by two-sided authentication of all network connections, homomorphically encrypted bids, and the publication of zero-knowledge proofs of every computation. This also allows a non-participant verifier to verify the result of any such computation using only the information broadcast on the network by each individual bidder. Building on previous work on such systems, we design and implement an extensible framework that puts the described ideas into practice. Apart from the actual implementation of the framework, our biggest contribution is the level of protection we are able to guarantee against the attacks described in previous work. In order to provide guidance to users of the library, we analyze the use of zero-knowledge proofs in ensuring the correct behavior of each node in a computation. We also describe the usage of the library to perform a private-value distributed auction, as well as other challenges in implementing the protocol, such as auction registration and certificate distribution. Finally, we provide performance statistics on our implementation of the auction.
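    The additive homomorphism at the heart of such a system can be illustrated with a textbook Paillier-style toy (insecure key sizes, assumed parameters; this is not the authors' framework): ciphertexts of two bids are multiplied, and decryption yields the sum of the bids without revealing either one.

      # Toy additively homomorphic (Paillier-style) encryption; Python 3.9+.
      import math, random

      def keygen(p=293, q=433):                       # tiny demo primes (insecure)
          n = p * q
          lam = math.lcm(p - 1, q - 1)
          mu = pow(lam, -1, n)                        # valid because g = n + 1
          return (n,), (n, lam, mu)

      def encrypt(pub, m):
          (n,) = pub
          n2 = n * n
          r = random.randrange(2, n)
          while math.gcd(r, n) != 1:
              r = random.randrange(2, n)
          return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

      def decrypt(priv, c):
          n, lam, mu = priv
          n2 = n * n
          l = (pow(c, lam, n2) - 1) // n
          return (l * mu) % n

      pub, priv = keygen()
      c1, c2 = encrypt(pub, 150), encrypt(pub, 275)   # two private bids
      c_sum = (c1 * c2) % (pub[0] ** 2)               # multiplying ciphertexts adds plaintexts
      print(decrypt(priv, c_sum))                     # -> 425, without revealing the bids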

  20. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.

    1979-01-01

    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed.
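    The two core steps the code automates, peak location and peak fitting, can be sketched with generic SciPy routines on synthetic data (this is not the CERPI/CEREL code):

      # Locate peaks in a synthetic Ge(Li)-like spectrum and fit each with a
      # Gaussian on a linear background (illustrative only).
      import numpy as np
      from scipy.signal import find_peaks
      from scipy.optimize import curve_fit

      channels = np.arange(2048)
      spectrum = 50 + 0.01 * channels + 800 * np.exp(-0.5 * ((channels - 661) / 3.0) ** 2)
      spectrum = np.random.default_rng(0).poisson(spectrum).astype(float)   # counting noise

      peaks, _ = find_peaks(spectrum, prominence=200)                       # automatic peak location

      def gauss_lin(x, area, centroid, sigma, a, b):
          return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2) + a + b * x

      for p in peaks:
          win = slice(p - 15, p + 15)
          popt, _ = curve_fit(gauss_lin, channels[win], spectrum[win],
                              p0=[spectrum[p] * 5, p, 3.0, 50.0, 0.0])
          print(f"peak at channel {popt[1]:.1f}, net area {popt[0]:.0f} counts")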