WorldWideScience

Sample records for facility integrated computer

  1. Status of the National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Bryant, R; Carey, R; Casavant, D; Edwards, O; Ferguson, W; Krammen, J; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Van Arsdall, P J; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete, with over 750 thousand source lines of code having undergone off-line verification tests and been deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world.
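
    The layering described above (supervisors in the control room coordinating hundreds of front-end processors over CORBA) can be illustrated with a minimal sketch. Python's standard-library XML-RPC stands in for CORBA here, and all class, method, and control-point names are hypothetical, not drawn from ICCS.

    ```python
    # Minimal sketch of the supervisor / front-end-processor split, with Python's
    # stdlib XML-RPC standing in for CORBA. All names here are hypothetical.
    import threading
    from xmlrpc.server import SimpleXMLRPCServer
    from xmlrpc.client import ServerProxy

    class FrontEndProcessor:
        """Owns a handful of control points, e.g. a deformable-mirror actuator."""
        def __init__(self):
            self.points = {"mirror_tilt_x": 0.0}

        def read_point(self, name):
            return self.points[name]

        def write_point(self, name, value):
            self.points[name] = float(value)
            return True

    server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
    server.register_instance(FrontEndProcessor())
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # A supervisor would hold proxies to hundreds of FEPs and sequence a shot;
    # here a single proxy applies and reads back one alignment correction.
    fep = ServerProxy("http://localhost:8000")
    fep.write_point("mirror_tilt_x", 0.42)
    print(fep.read_point("mirror_tilt_x"))   # -> 0.42
    ```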

  2. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving a specific software stack for the experiment in a shared environment is presented, along with its effects on user workload performance. A proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment, through adoption of cloud computing technology and the 'Infrastructure as Code' concept, completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.
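
    A minimal sketch of the kind of disk I/O microbenchmark such a design study relies on, contrasting sequential and random block reads. File name, file size, and block size are arbitrary assumptions, and on a warm page cache the two patterns may time similarly, so real benchmarks drop caches (or use files much larger than RAM) first.

    ```python
    # Sequential vs. random 4 KiB reads over a scratch file; sizes are arbitrary.
    import os, random, time

    PATH, FILE_SIZE, BLOCK = "bench.dat", 256 * 1024 * 1024, 4096

    with open(PATH, "wb") as f:              # create the test file once
        f.write(os.urandom(FILE_SIZE))

    def timed_reads(offsets):
        t0 = time.perf_counter()
        with open(PATH, "rb") as f:
            for off in offsets:
                f.seek(off)
                f.read(BLOCK)
        return time.perf_counter() - t0

    seq = list(range(0, FILE_SIZE, BLOCK))
    rnd = random.sample(seq, len(seq))       # same blocks, shuffled order
    print(f"sequential: {timed_reads(seq):.2f} s, random: {timed_reads(rnd):.2f} s")
    os.remove(PATH)
    ```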

  3. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  4. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality-affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  5. Status of the National Ignition Campaign and National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Brunton, G; Carey, R; Demaret, R; Fisher, J; Fishler, B; Ludwigsen, P; Marshall, C; Reed, R; Shelton, R; Townsend, S

    2011-03-18

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS), an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers, and supervisory servers. In the fall of 2010, a set of experiments began with deuterium- and tritium-filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high-energy-density experiments. During the past year, the control system was expanded to automate the cryogenic target system, and over 20 diagnostic systems were deployed and used in fusion experiments. This talk discusses the current status of the NIC and the plan for controls and information systems to support these experiments on the path to ignition.

  6. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: lagin1@llnl.gov; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)] (and others)

    2008-04-15

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF comprises 24 independent bundles of eight beams each, using laser hardware modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to the full 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including

  7. The National Ignition Facility: Status of the Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P J; Bryant, R; Carey, R; Casavant, D; Demaret, R; Edwards, O; Ferguson, W; Krammen, J; Lagin, L; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete, with over 750 thousand source lines of code having undergone off-line verification tests and been deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ for the full 192-beam system.

  8. Integrated Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the center of the 586-square-mile Hanford Site is the Integrated Disposal Facility, also known as the IDF. This facility is a landfill similar in concept...

  9. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana.

    Science.gov (United States)

    Nkansah, A; Schandorf, C; Boadu, M; Fletcher, J J

    2013-08-01

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thicknesses obtained ranged from 120 to 155 mm using the default DLP values proposed by the European Commission, and from 110 to 168 mm using DLP values derived from the four CT manufacturers. These values are within the accepted standard concrete wall thickness ranging from 102 to 152 mm prescribed by the NCRP. Ultrasonic pulse testing of all walls indicated that they are of good quality and free of voids, since the estimated pulse velocities were within the range of 3.496 ± 0.005 km s(-1). The average dose equivalent rate estimated for supervised areas is 3.4 ± 0.27 µSv week(-1), and that for the controlled area is 18.0 ± 0.15 µSv week(-1); both are within acceptable values.
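
    The dose-rate findings can be checked mechanically against published design goals. The sketch below compares the measured rates quoted in the abstract with the NCRP Report 147 shielding design goals (0.1 mSv/week for controlled areas, 0.02 mSv/week for uncontrolled areas); treating the 'supervised' areas as uncontrolled is our assumption for illustration, not the paper's.

    ```python
    # Compare measured dose-equivalent rates (from the abstract) with NCRP
    # Report 147 design goals; the goal values are cited from NCRP 147, not
    # from the paper itself.
    GOALS_USV_PER_WEEK = {"controlled": 100.0, "supervised": 20.0}
    measured = {"supervised": 3.4, "controlled": 18.0}   # µSv/week, from the paper

    for area, rate in measured.items():
        goal = GOALS_USV_PER_WEEK[area]
        verdict = "OK" if rate <= goal else "exceeds goal"
        print(f"{area}: {rate} µSv/week vs {goal} µSv/week goal -> {verdict}")
    ```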

  10. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  11. Physics Division computer facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cyborski, D.R.; Teh, K.M.

    1995-08-01

    The Physics Division maintains several computer systems for data analysis, general-purpose computing, and word processing. While the VMS VAX clusters are still used, this past year saw a greater shift to the Unix cluster with the addition of more RISC-based Unix workstations. The main Divisional VAX cluster consists of two VAX 3300s configured as a dual-host system that serves as boot node and disk server to seven satellite nodes: two VAXstation 3200s, three VAXstation 3100 machines, a VAX-11/750, and a MicroVAX II. There are three 6250/1600 bpi 9-track tape drives, six 8-mm tape drives, and about 9.1 GB of disk storage served to the cluster by the various satellites. Also, two of the satellites (the MicroVAX and VAX-11/750) have DAPHNE front-end interfaces for data acquisition. Since the tape drives are accessible cluster-wide via a software package, they are used for tape-to-tape copies in addition to replay. There is, however, a satellite node outfitted with two 8-mm drives available for this purpose. Although not part of the main cluster, a DEC 3000 Alpha machine obtained for data acquisition is also available for data replay. In one case, users reported a performance increase by a factor of 10 when using this machine.

  12. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script: constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.

  13. The Integral Test Facility Karlstein

    Directory of Open Access Journals (Sweden)

    Stephan Leyer

    2012-01-01

    The Integral Test Facility Karlstein (INKA) was designed and erected to test the performance of the passive safety systems of KERENA, the new AREVA Boiling Water Reactor design. The experimental program included single component/system tests of the Emergency Condenser, the Containment Cooling Condenser, and the Passive Core Flooding System. Integral system tests, also including the Passive Pressure Pulse Transmitter, will be performed to simulate transients and Loss of Coolant Accident scenarios at the test facility. The INKA test facility represents the KERENA Containment with a volume scaling of 1:24; component heights and levels are at full scale. The reactor pressure vessel is simulated by the accumulator vessel of the large valve test facility of Karlstein—a vessel with a design pressure of 11 MPa and a storage capacity of 125 m³. The vessel is fed by a Benson boiler with a maximum power supply of 22 MW. The INKA multi-compartment pressure suppression Containment meets the requirements of modern and existing BWR designs. As a result of the large power supply at the facility, INKA is capable of simulating various accident scenarios, including a full train of passive systems, starting with the initiating event—for example, a pipe rupture.

  14. Integrated Facilities and Infrastructure Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Reisz Westlund, Jennifer Jill

    2017-03-01

    Our facilities and infrastructure are a key element of our capability-based science and engineering foundation. The focus of the Integrated Facilities and Infrastructure Plan is the development and implementation of a comprehensive plan to sustain the capabilities necessary to meet national research, design, and fabrication needs for Sandia National Laboratories’ (Sandia’s) comprehensive national security missions both now and into the future. A number of Sandia’s facilities have reached the end of their useful lives and many others are not suitable for today’s mission needs. Due to the continued aging and surge in utilization of Sandia’s facilities, deferred maintenance has continued to increase. As part of our planning focus, Sandia is committed to halting the growth of deferred maintenance across its sites through demolition, replacement, and dedicated funding to reduce the backlog of maintenance needs. Sandia will become more agile in adapting existing space and changing how space is utilized in response to the changing requirements. This Integrated Facilities & Infrastructure (F&I) Plan supports the Sandia Strategic Plan’s strategic objectives, specifically Strategic Objective 2: Strengthen our Laboratories’ foundation to maximize mission impact, and Strategic Objective 3: Advance an exceptional work environment that enables and inspires our people in service to our nation. The Integrated F&I Plan is developed through a planning process model to understand the F&I needs, analyze solution options, plan the actions and funding, and then execute projects.

  15. Integrated Disposal Facility Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    MANN, F. M.

    2003-06-03

    An environmental risk assessment associated with the disposal of projected Immobilized Low-Activity Waste, solid wastes, and failed or decommissioned melters in an Integrated Disposal Facility was performed. Based on the analyses, all performance objectives associated with the groundwater, air, and intruder pathways were met.

  16. DKIST facility management system integration

    Science.gov (United States)

    White, Charles R.; Phelps, LeEllen

    2016-07-01

    The Daniel K. Inouye Solar Telescope (DKIST) Observatory is under construction at Haleakalā, Maui, Hawai'i. When complete, the DKIST will be the largest solar telescope in the world. The Facility Management System (FMS) is a subsystem of the high-level Facility Control System (FCS) and directly controls the Facility Thermal System (FTS). The FMS receives operational mode information from the FCS while making process data available to the FCS, and includes hardware and software to integrate and control all aspects of the FTS, including the Carousel Cooling System, the Telescope Chamber Environmental Control Systems, and the Temperature Monitoring System. In addition, it will integrate the Power Energy Management System and several service systems such as heating, ventilation, and air conditioning (HVAC), the Domestic Water Distribution System, and the Vacuum System. All of these subsystems must operate in coordination to provide the best possible observing conditions and overall building management. Further, the FMS must actively react to varying weather conditions and observational requirements. The physical impact of the facility must not interfere with neighboring installations while operating in a very environmentally and culturally sensitive area. The FMS will comprise five Programmable Automation Controllers (PACs). We present a pre-build overview of the functional plan to integrate all of the FMS subsystems.

  17. Design Integration of Facilities Management

    DEFF Research Database (Denmark)

    Jensen, Per Anker

    2009-01-01

    The paper identifies the aspects of FM that should be considered during the different stages of design. A typology of knowledge transfer from building operation to building design is presented, based on a combination of knowledge push from building operation and knowledge pull from building design. Strategies, methods and barriers for the transfer and integration of operational knowledge into the design process are discussed. Multiple strategies are needed to improve the integration of FM in design. Building clients must take on a leading role in defining and setting up requirements and procedures. Involvement of professional facilities managers in the design process is an obvious strategy, but increased competences are needed among building clients, designers and the operational staff. More codification of operational knowledge is also needed, for instance in IT systems. The paper is based...

  18. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, to link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presenting information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  1. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  2. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  3. Integrating ventilation monitoring sensor data with ventilation computer simulation software at the Waste Isolation Pilot Plant facility

    Energy Technology Data Exchange (ETDEWEB)

    Ruckman, R.; Prosser, B. [Mine Ventilation Services Inc., Clovis, CA (United States)

    2010-07-01

    This paper describes an ongoing ventilation study at an underground nuclear waste repository located in a bedded salt deposit in New Mexico. Underground airflow, differential pressure, primary fan information, and psychrometric monitors were integrated into a ventilation model for the Waste Isolation Pilot Plant (WIPP). The WIPPVENT ventilation software is based on the commercially available package VnetPC developed by Mine Ventilation Services Inc. The ventilation system at WIPP has been tested and balanced since 1988. The work has involved re-engineering some of the ventilation system components in order to mitigate the effects of natural ventilation pressures. Ventilation monitoring systems were also installed, and remote control of the main underground regulators was achieved. Modifications were also made to the VnetPC ventilation software package to allow for continuously updated, real-time ventilation models from the field measurement stations. This paper describes the modifications made to incorporate the real-time sensor data into the WIPPVENT program. 6 refs., 7 figs.
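
    A hedged sketch of the update step such an integration performs (this is illustrative structure, not the VnetPC/WIPPVENT API): each measured airflow/pressure pair refreshes the Atkinson resistance R = dp / Q² of its branch in the network model. Branch names and readings below are hypothetical.

    ```python
    # Fold live sensor readings into a ventilation network model by updating
    # branch resistances via the Atkinson square law p = R * Q^2.
    def updated_resistances(sensor_rows):
        """sensor_rows: iterable of (branch_id, dp_Pa, q_m3s) field measurements."""
        resistances = {}
        for branch, dp, q in sensor_rows:
            if q > 0:
                resistances[branch] = dp / q**2   # Ns^2/m^8
        return resistances

    readings = [("E-drift", 45.0, 120.0), ("W-drift", 30.0, 95.0)]  # hypothetical
    for branch, r in updated_resistances(readings).items():
        print(f"{branch}: R = {r:.6f} Ns^2/m^8")
    ```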

  4. INTEGRITY -- Integrated Human Exploration Mission Simulation Facility

    Science.gov (United States)

    Henninger, D.; Tri, T.; Daues, K.

    It is proposed to develop a high-fidelity ground facility to carry out long-duration human exploration mission simulations. These would not be merely computer simulations - they would in fact comprise a series of actual missions that just happen to stay on Earth. These missions would include all elements of an actual mission, using actual technologies that would be used for the real mission. These missions would also include such elements as extravehicular activities, robotic systems, telepresence and teleoperation, and surface drilling technology--all using a simulated planetary landscape. A sequence of missions would be defined that get progressively longer and more robust, perhaps a series of five or six missions over a span of 10 to 15 years ranging in duration from 180 days up to 1000 days. This high-fidelity ground facility would operate hand-in-hand with a host of other terrestrial analog sites such as the Antarctic, Haughton Crater, and the Arizona desert. Of course, all of these analog mission simulations will be conducted here on Earth in 1 g, and NASA will still need the Shuttle and ISS to carry out all the microgravity and hypogravity science experiments and technology validations. The proposed missions would have sufficient definition such that definitive requirements could be derived from them to serve as direction for all the program elements of the mission. Additionally, specific milestones would be established for the "launch" date of each mission so that R&D programs would have both good requirements and solid milestones from which to build their implementation plans. Mission aspects that could not be directly incorporated into the ground facility would be simulated via software. New management techniques would be developed for evaluation in this ground test facility program. These new techniques would have embedded metrics which would allow them to be continuously evaluated and adjusted so that by the time the sequence of missions is completed

  5. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
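
    The inverse problem described above can be made concrete with a toy Bayesian update: given a sensor model P(observation | facility state), each new observation reweights a belief over discrete operating states. The states, probabilities, and observation names below are illustrative assumptions, not from the paper.

    ```python
    # Toy inverse problem: infer a facility's operating state from noisy
    # observations via Bayes' rule. All numbers here are made up.
    states = ["idle", "normal_ops", "undeclared_campaign"]
    prior = {s: 1 / 3 for s in states}

    # P(observation | state) for one observable, e.g. a binned thermal signature
    likelihood = {
        "low_heat":  {"idle": 0.80, "normal_ops": 0.15, "undeclared_campaign": 0.05},
        "high_heat": {"idle": 0.05, "normal_ops": 0.45, "undeclared_campaign": 0.50},
    }

    def posterior(belief, obs):
        unnorm = {s: belief[s] * likelihood[obs][s] for s in belief}
        z = sum(unnorm.values())
        return {s: p / z for s, p in unnorm.items()}

    belief = prior
    for obs in ["high_heat", "high_heat"]:   # two successive observations
        belief = posterior(belief, obs)
    print(belief)   # mass shifts toward states consistent with high heat output
    ```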

  6. Oak Ridge Leadership Computing Facility (OLCF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of standing up a supercomputer 100 times...

  7. Rendezvous Facilities in a Distributed Computer System

    Institute of Scientific and Technical Information of China (English)

    廖先Zhi; 金兰

    1995-01-01

    The distributed computer system described in this paper is a set of computer nodes interconnected in an interconnection network via packet-switching interfaces. The nodes communicate with each other by means of message-passing protocols. This paper presents the implementation of rendezvous facilities as high-level primitives provided by a parallel programming language to support interprocess communication and synchronization.
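
    A sketch of what a rendezvous primitive provides, in the spirit of the paper but not its actual implementation: the caller blocks at an entry until the server accepts the call and returns a reply, so communication and synchronization happen in a single step. All names are hypothetical.

    ```python
    # Ada-style rendezvous built on message passing: caller and callee
    # synchronize at an "entry", and the call returns the callee's reply.
    import threading, queue

    class Entry:
        def __init__(self):
            self.calls = queue.Queue()

        def call(self, arg):                 # client side: synchronous
            reply = queue.Queue(maxsize=1)
            self.calls.put((arg, reply))
            return reply.get()               # block until the rendezvous completes

        def accept(self, handler):           # server side
            arg, reply = self.calls.get()    # block until a caller arrives
            reply.put(handler(arg))

    double = Entry()

    def server():
        while True:
            double.accept(lambda x: 2 * x)

    threading.Thread(target=server, daemon=True).start()
    print(double.call(21))   # -> 42; both sides met at the entry
    ```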

  8. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  9. Integral lightning protection system in petroleum facilities

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Horacio; Gallego, Luis; Montana, Johny; Younes, Camilo; Rondon, Daniel; Gonzalez, Diego; Herrera, Javier; Perez, Ernesto; Vargas, Mauricio; Quintana, Carlos; Salgado, Milton [Universidad Nacional de Colombia, Bogota (Colombia)]. E-mail: paas@paas.unal.edu.co

    2001-07-01

    This paper presents an Integral Lightning Protection System, focused mainly on petroleum facilities and applied to a real case in Colombia, South America. As an introduction, a summary of the incidents that occurred in recent years, a diagnosis, and a proposed solution are presented. Finally, as part of the analysis, a lightning risk assessment for the Central Process Facility is shown. (author)

  10. NIF Integrated Computer Controls System Description

    Energy Technology Data Exchange (ETDEWEB)

    VanArsdall, P.

    1998-01-26

    This System Description introduces the NIF Integrated Computer Control System (ICCS). The architecture is sufficiently abstract to allow the construction of many similar applications from a common framework. As discussed below, over twenty software applications derived from the framework comprise the NIF control system. This document lays the essential foundation for understanding the ICCS architecture. The NIF design effort is motivated by the magnitude of the task. Figure 1 shows a cut-away rendition of the coliseum-sized facility. The NIF requires integration of about 40,000 atypical control points, must be highly automated and robust, and will operate continuously around the clock. The control system coordinates several experimental cycles concurrently, each at different stages of completion. Furthermore, facilities such as the NIF represent major capital investments that will be operated, maintained, and upgraded for decades. The computers, control subsystems, and functionality must be relatively easy to extend or replace periodically with newer technology.
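
    A minimal sketch of the framework idea the description emphasizes: common lifecycle and distribution plumbing lives in an abstract base class, and each of the roughly twenty control applications derives from it. Class and method names here are hypothetical, not ICCS identifiers.

    ```python
    # Framework base class supplying common services; applications derive from it.
    from abc import ABC, abstractmethod

    class FrameworkApplication(ABC):
        """Common lifecycle every framework-derived application inherits."""
        def run(self):
            self.connect_control_points()
            self.configure()
            self.supervise()

        def connect_control_points(self):
            # framework-provided plumbing, shared by all applications
            print(f"[{type(self).__name__}] attaching to control points...")

        @abstractmethod
        def configure(self): ...

        @abstractmethod
        def supervise(self): ...

    class AlignmentControl(FrameworkApplication):
        def configure(self):  print("loading beam-alignment setpoints")
        def supervise(self):  print("closing alignment loops")

    AlignmentControl().run()
    ```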

  11. Integrated safeguards and facility design and operations

    Energy Technology Data Exchange (ETDEWEB)

    Tape, J.W.; Coulter, C.A.; Markin, J.T.; Thomas, K.E.

    1987-01-01

    The integration of safeguards functions to deter or detect unauthorized actions by insiders requires careful communication and management of safeguards-relevant information on a timely basis. The separation of safeguards functions into physical protection, materials control, and materials accounting often inhibits important information flows. Redefining the major safeguards functions as authorization, enforcement, and verification and careful attention to management of information can result in effective safeguards integration. Whether designing new systems or analyzing existing ones, understanding the interface between facility operations and safeguards is critical to cost-effective integrated safeguards systems that meet modern standards of performance.

  12. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility--supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER)--tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  13. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL; Hill, Jason J [ORNL; Thach, Kevin G [ORNL; Podhorszki, Norbert [ORNL; Klasky, Scott A [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL

    2011-01-01

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in the architecture and administration of large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  14. Computational evaluation of a neutron field facility

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Jose Julio de O.; Pazianotto, Mauricio T., E-mail: jjfilos@hotmail.com, E-mail: mpazianotto@gmail.com [Instituto Tecnologico de Aeronautica (ITA/DCTA), Sao Jose dos Campos, SP (Brazil); Federico, Claudio A.; Passaro, Angelo, E-mail: claudiofederico@ieav.cta.br, E-mail: angelo@ieav.cta.br [Instituto de Estudos Avancados (IEAv/DCTA), Sao Jose dos Campos, SP (Brazil)

    2015-07-01

    This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv) using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator producing neutrons by the {sup 3}H(d,n){sup 4}He reaction. The equipment produces neutrons with an energy of 14.1 MeV at a production rate of 2 x 10{sup 8} n/s in 4π geometry, and can also be used for neutron dosimetry studies. This work evaluated the spectra and neutron fluence at previously selected positions inside the facility, chosen in order to assess the ambient dose equivalent, so that the necessary adjustments can be made for the installation to be consistent with the guidelines of radiation protection and radiation safety determined by the standards of the National Nuclear Energy Commission (CNEN). (author)
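
    A back-of-the-envelope companion to the MCNP model: for an isotropic point source, the direct (unscattered) fluence rate is phi = S / (4πr²). The distances below are arbitrary example values, and room scatter, which is precisely why the Monte Carlo model is needed, is ignored.

    ```python
    # Direct-beam fluence rate from an isotropic point source of strength S.
    from math import pi

    S = 2e8                      # n/s, the 14.1 MeV generator yield quoted above
    for r in (1.0, 2.0, 5.0):    # metres (example distances, not from the paper)
        r_cm = 100 * r
        phi = S / (4 * pi * r_cm**2)
        print(f"r = {r} m: phi = {phi:.3e} n/cm^2/s (unscattered component only)")
    ```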

  15. Carbon dioxide neutral, integrated biofuel facility

    Energy Technology Data Exchange (ETDEWEB)

    Powell, E.E.; Hill, G.A. [Department of Chemical Engineering, University of Saskatchewan, 57 Campus Drive, Saskatoon, Saskatchewan, S7N 5A9 (Canada)

    2010-12-15

    Algae are efficient biocatalysts for both capture and conversion of carbon dioxide in the environment. In earlier work, we optimized the ability of Chlorella vulgaris to rapidly capture CO{sub 2} from man-made emission sources by varying environmental growth conditions and bioreactor design. Here we demonstrate that a coupled biodiesel-bioethanol facility, using yeast to produce ethanol and photosynthetic algae to produce biodiesel, can result in an integrated, economical, large-scale process for biofuel production. Each bioreactor acts as an electrode for a coupled complete microbial fuel cell system; the integrated cultures produce electricity that is consumed as an energy source within the process. Finally, both the produced yeast and the spent algae biomass can be used as value-added byproducts in the feed or food industries. Using cost and revenue estimates, an IRR of up to 25% is calculated for a 5-year project lifespan. (author)
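
    The quoted IRR is the discount rate at which the project's net present value reaches zero over the 5-year lifespan. A minimal sketch of that calculation, with invented placeholder cash flows rather than the paper's figures:

    ```python
    # Find the internal rate of return by bisecting on the sign of the NPV.
    def npv(rate, cashflows):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if npv(mid, cashflows) > 0:   # NPV still positive: rate is too low
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    flows = [-1000, 250, 300, 350, 400, 450]   # year-0 cost, then yearly revenue
    print(f"IRR = {irr(flows):.1%}")           # about 19.7% for these placeholders
    ```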

  16. Integrating Computational Chemistry into the Physical Chemistry Curriculum

    Science.gov (United States)

    Johnson, Lewis E.; Engel, Thomas

    2011-01-01

    Relatively few undergraduate physical chemistry programs integrate molecular modeling into their quantum mechanics curriculum owing to concerns about limited access to computational facilities, the cost of software, and concerns about increasing the course material. However, modeling exercises can be integrated into an undergraduate course at a…

  17. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than the results obtained from PET and CT separately and interpreted side by side or following software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered in many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensee processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to get information about facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs related to PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.
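
    One standard way such shielding calculations proceed is to compute the barrier transmission needed to bring the unshielded weekly dose down to a design goal, then convert that transmission to a thickness via tenth-value layers (TVLs). The TVL and workload numbers below are assumed round values for illustration, not data from this study.

    ```python
    # Barrier thickness from a required transmission factor, using TVLs.
    # The 511 keV concrete TVL and all dose figures here are assumptions.
    from math import log10

    TVL_CONCRETE_MM = 176        # assumed TVL for 511 keV annihilation photons

    def barrier_thickness(unshielded_uSv_per_week, goal_uSv_per_week):
        transmission = goal_uSv_per_week / unshielded_uSv_per_week
        n_tvl = log10(1 / transmission)      # tenth-value layers required
        return n_tvl * TVL_CONCRETE_MM

    D0 = 400.0    # µSv/week unshielded at the point of interest (assumed)
    P = 20.0      # µSv/week design goal for an uncontrolled area (NCRP-style)
    print(f"required concrete: {barrier_thickness(D0, P):.0f} mm")   # ~229 mm
    ```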

  1. Integrated computer-aided design using minicomputers

    Science.gov (United States)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational database management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a National Transonic Facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capability, CAD/CAM provides options to produce automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  2. Computer Profile of School Facilities Energy Consumption.

    Science.gov (United States)

    Oswalt, Felix E.

    This document outlines a computerized management tool designed to enable building managers to identify energy consumption as related to types and uses of school facilities for the purpose of evaluating and managing the operation, maintenance, modification, and planning of new facilities. Specifically, it is expected that the statistics generated…

  3. ASCR Cybersecurity for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean

    2015-02-27

    The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.

  4. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  5. PANDA: A Multipurpose Integral Test Facility for LWR Safety Investigations

    OpenAIRE

    2012-01-01

    The PANDA facility is a large scale, multicompartmental thermal hydraulic facility suited for investigations related to the safety of current and advanced LWRs. The facility is multipurpose, and the applications cover integral containment response tests, component tests, primary system tests, and separate effect tests. Experimental investigations carried on in the PANDA facility have been embedded in international projects, most of which under the auspices of the EU and OECD and with the supp...

  6. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  7. Experimental computation with oscillatory integrals

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-06-26

    A previous study by one of the present authors, together with D. Borwein and I. Leonard [8], studied the asymptotic behavior of the p-norm of the sinc function, sinc(x) = (sin x)/x, and along the way looked at closed forms for integer values of p. In this study we address these integrals with the tools of experimental mathematics, namely by computing their numerical values to high precision, both as a challenge in itself and in an attempt to recognize the numerical values as closed-form constants. With this approach, we are able to reproduce several of the results of [8] and to find new results, both numeric and analytic, that go beyond the previous study.
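
    The flavor of the computation can be reproduced with arbitrary-precision quadrature: evaluate I(p) = ∫_0^∞ |sinc x|^p dx by summing the integrals between consecutive zeros and accelerating the tail, then compare with the known closed form I(2) = π/2. A sketch using mpmath (our tooling choice, not necessarily the authors'):

    ```python
    # High-precision evaluation of sinc p-norm integrals via piecewise
    # quadrature between the zeros at n*pi plus series acceleration.
    from mpmath import mp, mpf, sin, pi, quad, nsum, inf

    mp.dps = 30                              # work to 30 significant digits

    def sinc_pow(p):
        return lambda x: (sin(x) / x) ** p if x != 0 else mpf(1)

    def I(p):
        f = sinc_pow(p)
        # sum integrals over [n*pi, (n+1)*pi]; nsum accelerates the
        # algebraically decaying tail of the series
        return nsum(lambda n: quad(f, [n * pi, (n + 1) * pi]), [0, inf])

    print(I(2))      # 1.5707963267948966... (matches pi/2)
    print(pi / 2)
    ```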

  8. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  9. Integrated quadratic assignment and continuous facility layout problem

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2012-10-01

    In this paper, an integrated layout model is considered that incorporates both intra- and inter-department layout. In the proposed model, the arrangement of facilities within the departments is obtained through the quadratic assignment problem (QAP), while the continuous layout problem is used to find the position and orientation of rectangular departments on the planar area. First, a modified version of the QAP with fewer binary variables is presented. Afterward, the integrated model is formulated based on the developed QAP. In order to evaluate material handling cost precisely, the actual positions of machines within the departments (instead of the centers of the departments) are considered. Moreover, other design factors such as aisle distance, single- or multi-row intra-department layout, and orientation of departments are considered. The mathematical model is formulated as a mixed-integer program (MIP) to minimize total material handling cost. Due to the complexity of the integrated model, a heuristic method has also been developed to solve large-scale problems in reasonable computational time. Finally, several illustrative numerical examples are selected from the literature to test the model and evaluate the heuristic.
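
    To make the QAP component concrete, the sketch below evaluates and minimizes the material handling cost by brute force on a toy instance; the flow and distance matrices are assumptions for illustration, not data from the paper.

        from itertools import permutations

        # toy data: flow[i][j] = material flow between machines i and j,
        # dist[a][b] = distance between candidate locations a and b
        flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
        dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
        n = len(flow)

        def handling_cost(assign):
            """QAP objective: sum of flow(i,j) * dist(assign(i), assign(j))."""
            return sum(flow[i][j] * dist[assign[i]][assign[j]]
                       for i in range(n) for j in range(n))

        best = min(permutations(range(n)), key=handling_cost)
        print(best, handling_cost(best))  # optimal machine-to-location assignment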

  10. Office of Chief Scientist, Integrated Research Facility (OCSIRF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Integrated Research Facility (IRF) is part of the Office of the Chief Scientist (OCS) for the Division of Clinical Research in the NIAID Office of...

  11. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaborating with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials, to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to improve the predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental constituents of matter - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  12. A digital computer propulsion control facility: Description of capabilities and summary of experimental program results

    Science.gov (United States)

    Zeller, J. R.; Arpasi, D. J.; Lehtinen, B.

    1976-01-01

    Flight-weight digital computers are being used today to carry out many of the propulsion system control functions previously delegated exclusively to hydromechanical controllers. An operational digital computer facility for propulsion control mode studies has been used successfully in several experimental programs. This paper describes the system and some of the results concerned with engine control, inlet control, and integrated inlet-engine control. Analytical designs for the digital propulsion control modes include both classical and modern/optimal techniques.

  13. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  14. Integrated Computer System of Management in Logistics

    Science.gov (United States)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  15. Posterior moments computed by mixed integration

    NARCIS (Netherlands)

    H.K. van Dijk (Herman); T. Kloek (Teun); C.G.E. Boender

    1985-01-01

    A flexible numerical integration method is proposed for the computation of moments of a multivariate posterior density with different tail properties in different directions. The method (called mixed integration) amounts to a combination of classical numerical integration and Monte Carlo integration.
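
    As a toy illustration of the principle (not the authors' algorithm), the sketch below combines deterministic Gauss-Legendre quadrature along an assumed fat-tailed coordinate with importance-sampling Monte Carlo over the other coordinate of a made-up bivariate posterior kernel.

        import numpy as np

        rng = np.random.default_rng(0)

        def kernel(x, y):
            # unnormalized posterior: Gaussian in x, Student-t-like fat tail in y
            return np.exp(-0.5 * x**2) * (1.0 + y**2 / 3.0) ** (-2.0)

        # deterministic quadrature along the fat-tailed direction y
        nodes, weights = np.polynomial.legendre.leggauss(200)
        y, wy = 50.0 * nodes, 50.0 * weights      # map [-1, 1] -> [-50, 50]

        xs = rng.standard_normal(20_000)          # Monte Carlo draws for x
        px = np.exp(-0.5 * xs**2) / np.sqrt(2.0 * np.pi)   # proposal density
        f = kernel(xs[:, None], y[None, :]) @ wy  # inner quadrature over y
        w = f / px                                # importance weights
        Z = w.mean()                              # normalizing constant estimate
        print("posterior mean of x:", (w * xs).mean() / Z)  # ~0 here by symmetry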

  16. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States). Safeguards Systems Group; Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States). Surety/Dismantlement Dept.

    1993-12-31

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  17. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States); Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States)

    1993-08-01

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  18. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and provide a considerably less rich operating environment than is common in HEP, but they also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  19. Facility Requirements for Integrated Learning Systems.

    Science.gov (United States)

    Knirk, Frederick G.

    1992-01-01

    Discusses features in the physical environment that need to be considered for integrated learning systems (ILSs). Highlights include ergonomics; lighting, including contrast and colors; space, furniture, and equipment, including keyboard, monitor, software, and printer; ambient noise and acoustics; temperature, humidity, and air quality control;…

  20. Integrated Disposal Facility FY2010 Glass Testing Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, Eric M.; Bacon, Diana H.; Kerisit, Sebastien N.; Windisch, Charles F.; Cantrell, Kirk J.; Valenta, Michelle M.; Burton, Sarah D.; Serne, R Jeffrey; Mattigod, Shas V.

    2010-09-30

    Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to provide the technical basis for estimating radionuclide release from the engineered portion of the disposal facility (e.g., source term). Vitrifying the low-activity waste at Hanford is expected to generate over 1.6 × 10^5 m^3 of glass (Puigh 1999). The volume of immobilized low-activity waste (ILAW) at Hanford is the largest in the DOE complex and is one of the largest inventories (approximately 0.89 × 10^18 Bq total activity) of long-lived radionuclides, principally Tc-99 (t_1/2 = 2.1 × 10^5 years), planned for disposal in a low-level waste (LLW) facility. Before the ILAW can be disposed, DOE must conduct a performance assessment (PA) for the Integrated Disposal Facility (IDF) that describes the long-term impacts of the disposal facility on public health and environmental resources. As part of the ILAW glass testing program, PNNL is implementing a strategy, consisting of experimentation and modeling, to provide the technical basis for estimating radionuclide release from the glass waste form in support of future IDF PAs. The purpose of this report is to summarize the progress made in fiscal year (FY) 2010 toward implementing the strategy, with the goal of developing an understanding of the long-term corrosion behavior of low-activity waste glasses. The emphasis in FY2010 was on completing an evaluation of the most sensitive kinetic rate law parameters used to predict glass weathering, documented in Bacon and Pierce (2010), and on transitioning from the Subsurface Transport Over Reactive Multiphases (STORM) computer code to the Subsurface Transport Over Multiple Phases (STOMP) code for near-field calculations. The FY2010 activities also consisted of developing a Monte Carlo and geochemical modeling framework that links glass composition to alteration phase formation by 1) determining the structure of unreacted and reacted glasses for use as input information into Monte Carlo

  1. Computational Modeling in Support of National Ignition Facility Operations

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M J; Sacks, R A; Haynam, C A; Williams, W H

    2001-10-23

    Numerical simulation of the National Ignition Facility (NIF) laser performance and automated control of laser setup process are crucial to the project's success. These functions will be performed by two closely coupled computer codes: the virtual beamline (VBL) and the laser operations performance model (LPOM).

  2. PANDA: A Multipurpose Integral Test Facility for LWR Safety Investigations

    Directory of Open Access Journals (Sweden)

    Domenico Paladino

    2012-01-01

    The PANDA facility is a large-scale, multicompartmental thermal-hydraulic facility suited for investigations related to the safety of current and advanced LWRs. The facility is multipurpose, and the applications cover integral containment response tests, component tests, primary system tests, and separate effect tests. Experimental investigations carried out in the PANDA facility have been embedded in international projects, most of them under the auspices of the EU and OECD and with the support of a large number of organizations (regulatory bodies, technical support organizations, national laboratories, electric utilities, and industries) worldwide. The paper provides an overview of the research programs performed in the PANDA facility in relation to BWR containment systems and those planned for PWR containment systems.

  3. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula; computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and the lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  4. Structural Integrity Program for INTEC Calcined Solids Storage Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey Bryant

    2008-08-30

    This report documents the activities of the structural integrity program at the Idaho Nuclear Technology and Engineering Center relevant to the high-level waste Calcined Solids Storage Facilities and associated equipment, as required by DOE M 435.1-1, 'Radioactive Waste Management Manual'. Based on the evaluation documented in this report, the Calcined Solids Storage Facilities are not leaking and are structurally sound for continued service. Recommendations are provided for continued monitoring of the Calcined Solids Storage Facilities.

  5. Energy Systems Integration Facility (ESIF): Facility Stewardship Plan, Revision 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Art [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hannegan, Bryan [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy, has established the Energy Systems Integration Facility (ESIF) on the campus of the National Renewable Energy Laboratory (NREL) and has designated it as a DOE user facility. This 182,500-sq. ft. research facility provides state-of-the-art laboratory and support infrastructure to optimize the design and performance of electrical, thermal, fuel, and information technologies and systems at scale. This Facility Stewardship Plan serves to provide DOE and other decision makers with information on the existing and expected capabilities of ESIF, and the expected performance metrics to be applied to ESIF operations. This Plan is a living document that will be updated and refined throughout the lifetime of the facility.

  6. Call Centre- Computer Telephone Integration

    Directory of Open Access Journals (Sweden)

    Dražen Kovačević

    2012-10-01

    Call centres largely came into being as a result of consumer needs converging with enabling technology, and by companies recognising the revenue opportunities generated by meeting those needs, thereby increasing customer satisfaction. Regardless of the specific application or activity of a call centre, customer satisfaction with the interaction is critical to the revenue generated or protected by the call centre. Physically, a call centre set-up is a place that includes computer, telephone, and supervisor stations. A call centre can be available 24 hours a day - when the customer wants to make a purchase, needs information, or simply wishes to register a complaint.

  7. System model of a natural circulation integral test facility

    Science.gov (United States)

    Galvin, Mark R.

    The Department of Nuclear Engineering and Radiation Health Physics (NE/RHP) at Oregon State University (OSU) has been developing an innovative modular reactor plant concept since the effort was initiated with a Department of Energy (DOE) grant in 1999. This concept, the Multi-Application Small Light Water Reactor (MASLWR), is an integral pressurized water reactor (PWR) plant that utilizes natural circulation flow in the primary system and employs advanced passive safety features. The OSU MASLWR test facility is an electrically heated integral effects facility, scaled from the MASLWR concept design, that has previously been used to assess the feasibility of the concept design safety approach. To assist in evaluating operational scenarios, a simulation tool has been developed that models the test facility and is based on both test facility experimental data and analytical methods. The tool models both the test facility electric core and a simulated nuclear core, allowing evaluation of a broad spectrum of operational scenarios to identify those that should be explored experimentally using the test facility or design-quality multi-physics tools. Using the simulation tool, the total cost of experimentation and analysis can be reduced by directing time and resources towards the operational scenarios of interest.

  8. Integrated biofuel facility, with carbon dioxide consumption and power generation

    Energy Technology Data Exchange (ETDEWEB)

    Powell, E.E.; Hill, G.A. [Saskatchewan Univ., Saskatoon, SK (Canada). Dept. of Chemical Engineering

    2009-07-01

    This presentation provided details of an economical design for a large-scale integrated biofuel facility for coupled production of bioethanol and biodiesel, with carbon dioxide capture and power generation. Several designs were suggested for both batch and continuous culture operations, taking into account all costs and revenues associated with the complete plant integration. The microalgae species Chlorella vulgaris was cultivated in a novel photobioreactor (PBR) in order to consume industrial carbon dioxide (CO{sub 2}). This photosynthetic culture can also act as a biocathode in a microbial fuel cell (MFC), which when coupled to a typical yeast anodic half cell, results in a complete biological MFC. The photosynthetic MFC produces electricity as well as valuable biomass and by-products. The use of this novel photosynthetic microalgae cathodic half cell in an integrated biofuel facility was discussed. A series of novel PBRs for continuous operation can be integrated into a large-scale bioethanol facility, where the PBRs serve as cathodic half cells and are coupled to the existing yeast fermentation tanks which act as anodic half cells. These coupled MFCs generate electricity for use within the biofuel facility. The microalgae growth provides oil for biodiesel production, in addition to the bioethanol from the yeast fermentation. The photosynthetic cultivation in the cathodic PBR also requires carbon dioxide, resulting in consumption of carbon dioxide from bioethanol production. The paper also discussed the effect of plant design on net present worth and internal rate of return. tabs., figs.

  9. Recent integral cross section validation measurements at the ASP facility

    CERN Document Server

    Packer, L W; Gilbert, M; Lilley, S; Pampin, R

    2013-01-01

    This work presents new integral data measured at the ASP 14 MeV neutron irradiation facility at Aldermaston in the UK, which has recently become available for fusion-related work through the CCFE materials programme. Measurements of reaction products from activation experiments using elemental foils were carried out using gamma spectrometry in a high efficiency, high-purity germanium (HPGe) detector and associated digital signal processing hardware. Following irradiation and rapid extraction to the measurement cell, gamma emissions were acquired with both energy and time bins. Integral cross section and half-life data have been derived from these measurements. Selected integral cross section values are presented from the measurement campaigns.
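
    For context, integral cross sections are typically derived from measured activities via the standard activation relation A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_cool). The sketch below inverts this relation with toy numbers; all values are assumptions for illustration, not ASP data.

        import numpy as np

        phi = 1.0e10          # 14 MeV neutron flux at the foil, n/cm^2/s (assumed)
        N_atoms = 2.0e21      # number of target atoms in the foil (assumed)
        t_irr, t_cool = 600.0, 120.0   # irradiation and cooling times, s
        half_life = 540.0              # product half-life, s (assumed)
        lam = np.log(2.0) / half_life

        A_meas = 3.0e4        # measured product activity, Bq (assumed)

        # invert A = N * sigma * phi * (1 - exp(-lam*t_irr)) * exp(-lam*t_cool)
        sigma_cm2 = A_meas / (N_atoms * phi * (1.0 - np.exp(-lam * t_irr))
                              * np.exp(-lam * t_cool))
        print("integral cross section:", sigma_cm2 * 1.0e24, "barns")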

  10. Integrated management of facility, process, and output: data model perspective

    Institute of Scientific and Technical Information of China (English)

    LEE Seunghoon; HAN Soonhung; MUN Duhwan

    2012-01-01

    As the manufacturing industry matures, vast amounts of data related to products are created by many kinds of engineering systems during the manufacturing phase. These include data for a variety of facilities, manufacturing processes, and the input and output of each process (input material, by-products, and intermediate and final products). Effective operation and maintenance of manufacturing facilities and eco-friendly products are gradually becoming important issues due to increased environmental regulations and changes in the enterprise business model. For this reason, increased efficiency in data management is necessary in the manufacturing industry. In this paper, existing data models for the integration of lifecycle data are analyzed according to their application domains. After the analysis, information requirements for the integrated management of facility, process, and output data are developed. According to these requirements, a data model appropriate for this integration is proposed. As an application case study, the use of the proposed data model for the effective operation and maintenance of manufacturing facilities is presented. Finally, the benefits, limitations, and possible improvements of the proposed data model are discussed.

  11. Entropy computing via integration over fractal measures.

    Science.gov (United States)

    Słomczynski, Wojciech; Kwapien, Jarosław; Zyczkowski, Karol

    2000-03-01

    We discuss the properties of invariant measures corresponding to iterated function systems (IFSs) with place-dependent probabilities and compute their Renyi entropies, generalized dimensions, and multifractal spectra. It is shown that with certain dynamical systems, one can associate the corresponding IFSs in such a way that their generalized entropies are equal. This provides a new method of computing entropy for some classical and quantum dynamical systems. Numerical techniques are based on integration over the fractal measures. (c) 2000 American Institute of Physics.
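
    As a rough illustration of the numerical technique (not the paper's actual computation), the sketch below samples the invariant measure of a two-map IFS with a place-dependent probability via the chaos game and estimates Renyi entropies from box frequencies; the maps and the probability function are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        maps = [lambda x: x / 3.0, lambda x: x / 3.0 + 2.0 / 3.0]  # Cantor-like IFS

        def p0(x):
            return 0.3 + 0.4 * x  # place-dependent probability of choosing map 0

        x, samples = 0.5, []
        for _ in range(200_000):  # "chaos game" orbit sampling the measure
            f = maps[0] if rng.random() < p0(x) else maps[1]
            x = f(x)
            samples.append(x)

        # box frequencies of the invariant measure at resolution 3**-6
        counts, _ = np.histogram(samples[1000:], bins=3**6, range=(0.0, 1.0))
        p = counts[counts > 0] / counts[counts > 0].sum()
        for q in (0.5, 1.0, 2.0):  # Renyi entropies H_q of the box partition
            H = -np.sum(p * np.log(p)) if q == 1.0 else np.log(np.sum(p**q)) / (1 - q)
            print(q, H)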

  12. Annual Summary of the Integrated Disposal Facility Performance Assessment 2012

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, R. [INTERA, Austin, TX (United States); Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2012-12-27

    An annual summary of the adequacy of the Hanford Immobilized Low-Activity Waste (ILAW) Performance Assessment (PA) is required each year (DOE O 435.1 Chg 1, DOE M 435.1-1 Chg 1, and DOE/ORP-2000-01). The most recently approved PA is DOE/ORP-2000-24. The ILAW PA evaluated the adequacy of the ILAW disposal facility, now referred to as the Integrated Disposal Facility (IDF), for the safe disposal of vitrified Hanford Site tank waste.

  13. Hybrid Parallel Computation of Integration in GRACE

    CERN Document Server

    Yuasa, Fukuko; Ishikawa, Tadashi; Kawabata, Setsuya; Perret-Gallix, Denis; Itakura, Kazuhiro; Hotta, Yukihiko; Okuda, Motoi

    2000-01-01

    With the integrated software package GRACE, it is possible to generate Feynman diagrams, calculate the total cross section, and generate physics events automatically. We outline the hybrid method of parallel computation of the multi-dimensional integration of GRACE. We used MPI (Message Passing Interface) as the parallel library and, to improve performance, we embedded a dynamic load-balancing mechanism. The resulting reduction in practical execution time was studied.
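
    In outline, dynamically load-balanced integration can be realized with a master-worker scheme like the sketch below, which uses mpi4py and a plain Monte Carlo integrand as illustrative stand-ins; GRACE's actual integrand and integration scheme are far more elaborate. Run with, e.g., mpiexec -n 4 python integrate.py (a hypothetical file name).

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        TAG_WORK, TAG_DONE = 1, 2

        def mc_box(box, n=200_000, seed=0):
            """Monte Carlo estimate of the integral of exp(-(x^2 + y^2)) over a box."""
            (x0, x1), (y0, y1) = box
            rng = np.random.default_rng(seed)
            xs, ys = rng.uniform(x0, x1, n), rng.uniform(y0, y1, n)
            return (x1 - x0) * (y1 - y0) * np.mean(np.exp(-(xs**2 + ys**2)))

        if rank == 0:
            # master: deal sub-boxes out one at a time; fast workers get more work
            boxes = [((0.5 * i, 0.5 * (i + 1)), (0.5 * j, 0.5 * (j + 1)))
                     for i in range(4) for j in range(4)]  # [0,2]^2 in 16 boxes
            total, active = 0.0, 0
            for dest in range(1, size):
                if boxes:
                    comm.send(boxes.pop(), dest=dest, tag=TAG_WORK)
                    active += 1
            while active:
                status = MPI.Status()
                total += comm.recv(source=MPI.ANY_SOURCE, tag=TAG_DONE, status=status)
                dest = status.Get_source()
                if boxes:
                    comm.send(boxes.pop(), dest=dest, tag=TAG_WORK)
                else:
                    comm.send(None, dest=dest, tag=TAG_WORK)
                    active -= 1
            print("integral over [0,2]^2 ~", total)
        else:
            # worker: keep processing boxes until told to stop
            while True:
                box = comm.recv(source=0, tag=TAG_WORK)
                if box is None:
                    break
                comm.send(mc_box(box, seed=rank), dest=0, tag=TAG_DONE)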

  14. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs able to reach a computing power of 300 gigaflops (300 × 10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration plus 6 TB for the users area. AVES was designed and built to solve growing problems arising from the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and due to increase every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package, consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload over the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of online data storage. The AVES software package consists of about 50 specific programs. The total computing time, compared to that of a single-processor personal computer, has thus been improved by up to a factor of 70.
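
    The job-splitting idea can be sketched as follows; the science-window list, the osa_analysis.sh wrapper script, and the sbatch options are hypothetical placeholders rather than AVES internals.

        import subprocess

        # one INTEGRAL science-window identifier per line (hypothetical input file)
        scw = [line.strip() for line in open("science_windows.txt") if line.strip()]

        N = 30  # number of nodes/cores available
        chunks = [scw[i::N] for i in range(N)]  # round-robin split into N jobs

        for job_id, chunk in enumerate(chunks):
            with open(f"chunk_{job_id}.txt", "w") as f:
                f.write("\n".join(chunk) + "\n")
            # each chunk becomes an independent SLURM job running the OSA pipeline
            subprocess.run(["sbatch", "--job-name", f"osa_{job_id}",
                            "--wrap", f"./osa_analysis.sh chunk_{job_id}.txt"],
                           check=True)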

  15. Teaching Cardiovascular Integrations with Computer Laboratories.

    Science.gov (United States)

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

    Describes a computer-based instructional unit in cardiovascular physiology. The program (which employs simulated laboratory experimental techniques with a problem-solving format) is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  16. Integrating Computer-Mediated Communication Strategy Instruction

    Science.gov (United States)

    McNeil, Levi

    2016-01-01

    Communication strategies (CSs) play important roles in resolving problematic second language interaction and facilitating language learning. While studies in face-to-face contexts demonstrate the benefits of communication strategy instruction (CSI), there have been few attempts to integrate computer-mediated communication and CSI. The study…

  17. Vehicle Testing and Integration Facility; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-03-02

    Engineers at the National Renewable Energy Laboratory’s (NREL’s) Vehicle Testing and Integration Facility (VTIF) are developing strategies to address two separate but equally crucial areas of research: meeting the demands of electric vehicle (EV) grid integration and minimizing fuel consumption related to vehicle climate control. Dedicated to renewable and energy-efficient solutions, the VTIF showcases technologies and systems designed to increase the viability of sustainably powered vehicles. NREL researchers instrument every class of on-road vehicle, conduct hardware and software validation for EV components and accessories, and develop analysis tools and technology for the Department of Energy, other government agencies, and industry partners.

  18. VISTA : thermal-hydraulic integral test facility for SMART reactor

    Energy Technology Data Exchange (ETDEWEB)

    Choi, K. Y.; Park, H. S.; Cho, S.; Park, C. K.; Lee, S. J.; Song, C. H.; Chung, M. K. [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Preliminary performance tests were carried out using the thermal-hydraulic integral test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents), which has been constructed to simulate the SMART-P. The VISTA facility is an integral test facility including the primary and secondary systems as well as safety-related Passive Residual Heat Removal (PRHR) systems. Its scaling ratio with respect to the SMART-P is 1/1 in height and 1/96 in volume and heater power. Several steady-state and power-change tests were carried out to verify the overall thermal-hydraulic characteristics of the primary and secondary sides in the range of 10% to 100% power operation. As preliminary results, the steady-state conditions were found to coincide with the expected design values of the SMART-P, but the major thermal-hydraulic parameters are greatly affected by the initial water level and the nitrogen pressure in the reactor's upper annular cavity. The power step/ramp change tests were successfully carried out and the system responses were observed. Primary natural circulation operation was achieved, but advanced control logic needs to be developed to reach the natural circulation mode without a pressure excursion. In the PRHR transient tests, the natural circulation flow rate through the PRHR system was found to be about 10 percent in the early phases of PRHR operation.

  19. Accuracy of Stokes integration for geoid computation

    Science.gov (United States)

    Ismail, Zahra; Jamet, Olivier; Altamimi, Zuheir

    2014-05-01

    Geoid determination by the remove-compute-restore (RCR) technique involves the application of Stokes's integral on reduced gravity anomalies. Reduced gravity anomalies are obtained through interpolation after removing the low-degree gravity signal from a spherical harmonic model and the high frequencies from topographical effects, and they cover a spectrum beginning around degree 150-200. Stokes's integral is truncated to a limited region around the computation point, producing an error that can be reduced by a modification of Stokes's kernel. We study the accuracy of Stokes integration on synthetic signals of various frequency ranges, produced with EGM2008 spherical harmonic coefficients up to degree 2000. We analyse the integration error according to the frequency range of the signal, the resolution of the gravity anomaly grid, and the radius of Stokes integration. The study shows that the behaviour of the relative errors is frequency independent. The standard Stokes kernel is nevertheless insufficient to produce 1 cm geoid accuracy without removal of the major part of the gravity signal up to degree 600. Integration over an area of radius greater than 3 degrees does not improve the accuracy further. The results are compared to a similar experiment using the modified Stokes kernel formula (Ellmann 2004, Sjöberg 2003). References: Ellmann, A. (2004). The geoid for the Baltic countries determined by least-squares modification of Stokes' formula. Sjöberg, L.E. (2003). A general model of modifying Stokes' formula and its least-squares solution. Journal of Geodesy, 77, 459-464.
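
    For concreteness, truncated Stokes integration on a grid of residual anomalies can be sketched as follows. The grid, the synthetic anomaly field, and the 3-degree cap are illustrative assumptions; the kernel is the standard Stokes function, and the geoid height follows N = R/(4*pi*gamma) * integral of S(psi)*dg over the cap.

        import numpy as np

        R, gamma = 6371000.0, 9.81  # mean Earth radius (m), normal gravity (m/s^2)

        def stokes_kernel(psi):
            """Standard Stokes function S(psi), psi in radians."""
            s = np.sin(psi / 2.0)
            return (1.0 / s - 6.0 * s + 1.0 - 5.0 * np.cos(psi)
                    - 3.0 * np.cos(psi) * np.log(s + s**2))

        lat0, lon0 = np.radians(45.0), np.radians(10.0)  # computation point
        step = np.radians(0.1)                           # 6-arcmin grid spacing
        lat = np.radians(np.arange(42.0, 48.0, 0.1))
        lon = np.radians(np.arange(7.0, 13.0, 0.1))
        LON, LAT = np.meshgrid(lon, lat)

        # synthetic residual anomalies in m/s^2 (1 mGal = 1e-5 m/s^2)
        dg = 10e-5 * np.exp(-((LAT - lat0)**2 + (LON - lon0)**2) / 1e-3)

        # spherical distance psi from the computation point to every grid cell
        cospsi = (np.sin(lat0) * np.sin(LAT)
                  + np.cos(lat0) * np.cos(LAT) * np.cos(LON - lon0))
        psi = np.arccos(np.clip(cospsi, -1.0, 1.0))

        cap = (psi > 1e-6) & (psi < np.radians(3.0))  # truncation cap
        dA = np.cos(LAT) * step * step                # cell solid angles
        N_geoid = R / (4.0 * np.pi * gamma) * np.sum(
            stokes_kernel(psi[cap]) * dg[cap] * dA[cap])
        print("truncated-Stokes geoid contribution (m):", N_geoid)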

  20. Cuby: An integrative framework for computational chemistry.

    Science.gov (United States)

    Řezáč, Jan

    2016-05-15

    Cuby is a computational chemistry framework written in the Ruby programming language. It provides unified access to a wide range of computational methods by interfacing external software, and it implements various protocols that operate on their results. Using structured input files, elementary calculations can be combined into complex workflows. For users, Cuby provides a unified and user-friendly way to automate their work, seamlessly integrating calculations carried out in different computational chemistry programs. For example, the QM/MM module allows combining methods across the interfaced programs, and the built-in molecular dynamics engine makes it possible to run a simulation on the resulting potential. For programmers, it provides a high-level, object-oriented environment that allows rapid development and testing of new methods and computational protocols. The Cuby framework is available for download at http://cuby4.molecular.cz. © 2016 Wiley Periodicals, Inc.

  1. A Supply Chain Design Problem Integrated Facility Unavailabilities Management

    Directory of Open Access Journals (Sweden)

    Fouad Maliki

    2016-08-01

    A supply chain is a set of facilities connected together in order to provide products to customers. The supply chain is subject to random failures caused by different factors, which cause the unavailability of some sites. Given the current economic context, the management of these unavailabilities is becoming a strategic choice to ensure the desired reliability and availability levels of the different supply chain facilities. In this work, we treat two problems related to the field of supply chains, namely the design and unavailability management of logistics facilities. Specifically, we consider a stochastic distribution network with consideration of supplier selection, distribution centre (DC) location decisions, and management of DC unavailabilities. Two resolution approaches are proposed. The first, non-integrated approach consists of defining the optimal supply chain structure using an optimization approach based on genetic algorithms (GA) and then simulating the supply chain performance in the presence of DC failures. The second, integrated approach considers the design of the supply chain and the unavailability management of DCs in the same model. Note that in both approaches each unavailable DC is replaced by performing a reallocation using the GA. The results obtained with the two approaches are detailed and compared, showing their effectiveness.

  2. Integrated Disposal Facility FY2011 Glass Testing Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, Eric M.; Bacon, Diana H.; Kerisit, Sebastien N.; Windisch, Charles F.; Cantrell, Kirk J.; Valenta, Michelle M.; Burton, Sarah D.; Westsik, Joseph H.

    2011-09-29

    Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to provide the technical basis for estimating radionuclide release from the engineered portion of the disposal facility (e.g., source term). Vitrifying the low-activity waste at Hanford is expected to generate over 1.6 x 10{sup 5} m{sup 3} of glass (Certa and Wells 2010). The volume of immobilized low-activity waste (ILAW) at Hanford is the largest in the DOE complex and is one of the largest inventories (approximately 8.9 x 10{sup 14} Bq total activity) of long-lived radionuclides, principally {sup 99}Tc (t{sub 1/2} = 2.1 x 10{sup 5}), planned for disposal in a low-level waste (LLW) facility. Before the ILAW can be disposed, DOE must conduct a performance assessment (PA) for the Integrated Disposal Facility (IDF) that describes the long-term impacts of the disposal facility on public health and environmental resources. As part of the ILAW glass testing program PNNL is implementing a strategy, consisting of experimentation and modeling, in order to provide the technical basis for estimating radionuclide release from the glass waste form in support of future IDF PAs. The purpose of this report is to summarize the progress made in fiscal year (FY) 2011 toward implementing the strategy with the goal of developing an understanding of the long-term corrosion behavior of low-activity waste glasses.

  3. Integrative Genomics and Computational Systems Medicine

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Jason E.; Huang, Yufei; Zhang, Bing; Xu, Hua; Zhao, Zhongming

    2014-01-01

    The exponential growth in the generation of genomic data from biological samples has driven the emerging field of systems medicine. This field is promising because it improves our understanding of disease processes at the systems level. However, the field is still in its early stages. There is a great need for novel computational methods and approaches to effectively utilize and integrate various omics data.

  4. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  5. Computing Fourier integral operators with caustics

    Science.gov (United States)

    Caday, Peter

    2016-12-01

    Fourier integral operators (FIOs) have widespread applications in imaging, inverse problems, and PDEs. An implementation of a generic algorithm for computing FIOs associated with canonical graphs is presented, based on a recent paper of de Hoop et al. Given the canonical transformation and principal symbol of the operator, a preprocessing step reduces application of an FIO approximately to multiplications, pushforwards, and forward and inverse discrete Fourier transforms, which can be computed in O(N^(n+(n-1)/2) log N) time for an n-dimensional FIO. The same preprocessed data also allows computation of the inverse and transpose of the FIO, with identical runtime. Examples demonstrate the algorithm’s output, and easily extendible MATLAB/C++ source code is available from the author.
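
    As a minimal illustration of the "multiplications plus Fourier transforms" building blocks (not the paper's algorithm, which handles genuine canonical graphs and caustics), the degenerate case where the canonical transformation is the identity reduces the FIO to a Fourier multiplier; the order-1/2 symbol below is an assumption for illustration.

        import numpy as np

        n = 256
        x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        u = np.sin(3.0 * x) + 0.5 * np.cos(7.0 * x)

        xi = np.fft.fftfreq(n, d=1.0 / n)      # integer frequencies on the grid
        symbol = (1.0 + xi**2) ** 0.25         # elliptic symbol of order 1/2
        Pu = np.fft.ifft(symbol * np.fft.fft(u)).real  # multiply in Fourier space
        print(Pu[:4])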

  6. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Blood Establishment Computer System... ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility'' dated April... establishment computer system validation program, consistent with recognized principles of software...

  7. Operational readiness: an integral part of the facility planning process.

    Science.gov (United States)

    Kidd, LeeAnne; Howe, Rob

    2014-01-01

    Large capital building projects benefit from an operational readiness strategy prior to new facility occupancy. St. Joseph's Healthcare used a structured approach for its readiness planning that included individual work-plan meetings, tools for ensuring integration across programs and services, and process improvement support to ensure a smooth transition. Over 1100 staff were oriented using a Train-the-Trainer model. Significant effort was required to coordinate the customized training, which involved "staffing up" to ensure sufficient resources for backfill. Operational readiness planning places additional demands on managers, requiring support and assistance from dedicated resources both prior to occupancy and for several months post-move.

  8. Integrated Disposal Facility FY 2012 Glass Testing Summary Report, Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-02

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  9. Integrated Disposal Facility FY2011 Glass Testing Summary Report Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  10. Magnetohydrodynamic projects at the CDIF (Component Development and Integration Facility)

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This quarterly technical progress report presents the tasks accomplished at the Component Development and Integration Facility during the fourth quarter of FY90. Areas of technical progress this quarter included: coal system development; seed system development; test bay modification; channel power dissipation and distribution system development; oxygen system storage upgrade; iron core magnet thermal protection system oxygen checkout; TRW slag rejector/CDIF slag removal project; stack gas/environmental compliance upgrade; coal-fired combustor support; 1A channels fabrication and assembly; support of Mississippi State University diagnostic testing; test operations and results; data enhancement; data analysis and modeling; technical papers; and projected activities. 2 tabs.

  11. Integrated Disposal Facility FY 2012 Glass Testing Summary Report, Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-02

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  12. Integrated Disposal Facility FY2011 Glass Testing Summary Report. Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  13. Integrated Disposal Facility FY 2012 Glass Testing Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, Eric M.; Kerisit, Sebastien N.; Krogstad, Eirik J.; Burton, Sarah D.; Bjornstad, Bruce N.; Freedman, Vicky L.; Cantrell, Kirk J.; Snyder, Michelle MV; Crum, Jarrod V.; Westsik, Joseph H.

    2013-03-29

    PNNL is conducting work to provide the technical basis for estimating radionuclide release from the engineered portion of the disposal facility for Hanford immobilized low-activity waste (ILAW). Before the ILAW can be disposed, DOE must conduct a performance assessment (PA) for the Integrated Disposal Facility (IDF) that describes the long-term impacts of the disposal facility on public health and environmental resources. As part of the ILAW glass testing program, PNNL is implementing a strategy, consisting of experimentation and modeling, to provide the technical basis for estimating radionuclide release from the glass waste form in support of future IDF PAs. Key activities in FY12 include upgrading the STOMP/eSTOMP codes to do near-field modeling, geochemical modeling of PCT tests to determine the reaction network to be used in the STOMP codes, conducting PUF tests on selected glasses to simulate and accelerate glass weathering, developing a Monte Carlo simulation tool to predict the characteristics of the weathered glass reaction layer as a function of glass composition, and characterizing glasses and soil samples exhumed from an 8-year lysimeter test. The purpose of this report is to summarize the progress made in fiscal year (FY) 2012 and the first quarter of FY 2013 toward implementing the strategy with the goal of developing an understanding of the long-term corrosion behavior of LAW glasses.

  14. Consistent Posttest Calculations for LOCA Scenarios in LOBI Integral Facility

    Directory of Open Access Journals (Sweden)

    F. Reventós

    2012-01-01

    Integral test facilities (ITFs) are one of the main tools for the validation of best-estimate thermal-hydraulic system codes. The experimental data are also of great value when compared to the experiment-scaled conditions in a full NPP. The LOBI was a single- plus a triple-loop (simulated by one loop) test facility, electrically heated to simulate a 1300 MWe PWR. The scaling factor was 712 for the core power, volume, and mass flow. The primary and secondary sides contained all main active elements. Tests were performed for the characterization of phenomenologies relevant to large- and small-break LOCAs and special transients in PWRs. The paper presents the results of three posttest calculations of LOBI experiments. The selected experiments are BL-30, BL-44, and A1-84. They are LOCA scenarios of different break sizes and with different availability of safety injection components. The goal of the analysis is to improve the knowledge of the phenomena that occurred in the facility in order to use it in further studies related to qualifying nodalizations of actual plants or to establishing accuracy databases for uncertainty methodologies. An example procedure for implementing changes in a common nodalization valid for simulating tests performed in a specific ITF is presented, along with its confirmation based on posttest results.

  15. Commissioning of the ATLAS thermal-hydraulic integral test facility

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon-Sik [Thermal Hydraulics Safety Research Division, Korea Atomic Energy Research Institute, 1045 Daedeokdaero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)], E-mail: yskim3@kaeri.re.kr; Choi, Ki-Yong; Park, Hyeon-Sik; Cho, Seok; Kim, Bok-Deug; Choi, Nam-Hyeon; Baek, Won-Pil [Thermal Hydraulics Safety Research Division, Korea Atomic Energy Research Institute, 1045 Daedeokdaero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)

    2008-10-15

    KAERI recently constructed a new thermal-hydraulic integral test facility for advanced pressurized water reactors (PWRs) - ATLAS. The ATLAS facility has the following characteristics: (a) 1/2-height and length, 1/288-volume, and full-pressure simulation of APR1400; (b) geometrical similarity with APR1400, including 2 (hot legs) x 4 (cold legs) reactor coolant loops, direct vessel injection (DVI) of emergency core cooling water, and an integrated annular downcomer; (c) incorporation of specific design characteristics of OPR1000, such as cold leg injection and low-pressure safety injection pumps; (d) a maximum of 10% of the scaled nominal core power. The ATLAS will mainly be used to simulate various accident and transient scenarios for the evolutionary PWRs OPR1000 and APR1400, with the capability of simulating a broad range of scenarios, including the reflood phase of a large-break loss-of-coolant accident (LOCA), small-break LOCA scenarios including DVI line breaks, a steam generator tube rupture, a main steam line break, a feed line break, and mid-loop operation. The ATLAS is now in operation, after an extensive series of commissioning tests in 2006.

  16. High Resolution Muon Computed Tomography at Neutrino Beam Facilities

    CERN Document Server

    Suerfu, Burkhant

    2015-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pio...

  17. Integrating network awareness in ATLAS distributed computing

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Maeno, T; Mckee, S; Nilsson, P; Petrosyan, A; Vukotic, I; Wenaus, T

    2014-01-01

    A crucial contributor to the success of the massively scaled global computing system that delivers the analysis needs of the LHC experiments is the networking infrastructure upon which the system is built. The experiments have been able to exploit excellent high-bandwidth networking in adapting their computing models for the most efficient utilization of resources. New advanced networking technologies now becoming available such as software defined networks hold the potential of further leveraging the network to optimize workflows and dataflows, through proactive control of the network fabric on the part of high level applications such as experiment workload management and data management systems. End to end monitoring of networking and data flow performance further allows applications to adapt based on real time conditions. We will describe efforts underway in ATLAS on integrating network awareness at the application level, particularly in workload management.

  18. Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities

    Science.gov (United States)

    Ross, Richard W.

    2001-01-01

    The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off-the-shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel who work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next-generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.

  19. Flow simulation of the Component Development Integration Facility magnetohydrodynamic power train system

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.L.; Lottes, S.A.; Bouillard, J.X.; Petrick, M.

    1997-11-01

    This report covers application of Argonne National Laboratory's (ANL's) computer codes to simulation and analysis of components of the magnetohydrodynamic (MHD) power train system at the Component Development and Integration Facility (CDIF). Major components of the system include a 50-MWt coal-fired, two-stage combustor and an MHD channel. The combustor, designed and built by TRW, includes a deswirl section between the first- and second-stage combustors and a converging nozzle following the second-stage combustor, which connects to the MHD channel. ANL used computer codes to simulate and analyze flow characteristics in various components of the MHD system. The first-stage swirl combustor was deemed a mature technology and, therefore, was not included in the computer simulation. Several versions of the ICOMFLO computer code were used for the deswirl section and second-stage combustor. The MGMHD code, upgraded with a slag current leakage submodel, was used for the MHD channel. Whenever possible, data from the test facilities were used to aid in calibrating parameters in the computer code, to validate the computer code, or to set base-case operating conditions for computations with the computer code. Extensive sensitivity and parametric studies were done on cold-flow mixing in the second-stage combustor, reacting flow in the second-stage combustor and converging nozzle, and particle-laden flow in the deswirl zone of the first-stage combustor, the second-stage combustor, and the converging nozzle. These simulations and subsequent analysis were able to show clearly, in flow patterns and various computable measures of performance, a number of sensitive and problematical areas in the design of the power train. The simulations of upstream components also provided inlet parameter profiles for simulation of the MHD power generating channel. 86 figs., 18 tabs.

  20. Integrated Electrical and Thermal Grid Facility - Testing of Future Microgrid Technologies

    Directory of Open Access Journals (Sweden)

    Sundar Raj Thangavelu

    2015-09-01

    This paper describes the Experimental Power Grid Centre (EPGC) microgrid test facility, which was developed to enable research, development and testing of a wide range of distributed generation and microgrid technologies. The EPGC microgrid facility comprises an integrated electrical and thermal grid with a flexible and configurable architecture, and includes various distributed energy resources and emulators, such as generators, renewables, energy storage technologies and programmable load banks. The integrated thermal grid provides an opportunity to harness waste heat produced by the generators for combined heat, power and cooling applications, and supports research on the optimization of combined electrical-thermal systems. Several case studies are presented to demonstrate the testing of different control and operation strategies for storage systems in grid-connected and islanded microgrids. One of the case studies also demonstrates use of the integrated thermal grid to convert waste heat to useful energy, which has thus far resulted in higher combined energy efficiency. Experimental results confirm that the facility enables testing and evaluation of grid technologies and exposes practical problems that may not be apparent in a computer-simulated environment.

  1. Evaluating Computer Technology Integration in a Centralized School System

    Science.gov (United States)

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  2. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  3. The LOBI Integral System Test Facility Experimental Programme

    Directory of Open Access Journals (Sweden)

    Carmelo Addabbo

    2012-01-01

    The LOBI project has been carried out in the framework of the European Commission Reactor Safety Research Programme in close collaboration with institutional and/or industrial research organizations of EC member countries. The primary objective of the research programme was the generation of an experimental data base for the assessment of the predictive capabilities of thermal-hydraulic system codes used in pressurised water reactor safety analysis. Within this context, experiments have been conducted in the LOBI integral system test facility designed, constructed, and operated (1979–1991) at the Ispra Site of the Joint Research Centre. This paper provides a historical perspective and summarizes major achievements of the research programme which has represented an effective approach to international collaboration in the field of reactor safety research and development. Emphasis is also placed on knowledge management aspects of the acquired experimental data base and on related online open access/retrieval user functionalities.

  4. Heterogeneous Electronics – Wafer Level Integration, Packaging, and Assembly Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility integrates active electronics with microelectromechanical (MEMS) devices at the miniature system scale. It obviates current size-, weight-, and power...

  6. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS computing facility) at brookhaven national lab

    Science.gov (United States)

    Karasawa, Mizuka; Chan, Tony; Smith, Jason

    2010-04-01

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs and the increasingly prominent role of distributed computing require the RACF to move from a system-based to a service-based SLA with its user communities. A service-based SLA allows the RACF to coordinate more efficiently the operation, maintenance and development of the facility by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism and service ticket system at the facility to make the new SLA a reality.
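
    The dependency-matrix idea can be made concrete with a small sketch: given a map of which services depend on which systems, an alarm layer can notify operators of every service transitively affected by a failure. The structure and names below are illustrative only, not the RACF implementation:

        # Illustrative dependency-driven alerting (hypothetical service names).
        DEPENDS_ON = {
            "batch_service":   ["scheduler", "shared_fs"],
            "analysis_portal": ["batch_service", "web_frontend"],
        }

        def affected_services(failed: str, deps=DEPENDS_ON) -> set:
            """All services whose dependency chain includes the failed component."""
            hit = set()
            changed = True
            while changed:
                changed = False
                for svc, reqs in deps.items():
                    if svc not in hit and (failed in reqs or hit & set(reqs)):
                        hit.add(svc)
                        changed = True
            return hit

        print(affected_services("shared_fs"))
        # -> {'batch_service', 'analysis_portal'}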

  7. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  8. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  9. Knowledge Management tools integration within DLR's concurrent engineering facility

    Science.gov (United States)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. It also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of applying the KM tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF is intended to become a basic practice during the CE process. The establishment of this practice will result in a much more extensive knowledge and experience exchange within the Concurrent Engineering environment and, consequently, the outcome of the studies will comprise higher quality in the design of space systems.

  10. Sensor fusion control system for computer integrated manufacturing

    CSIR Research Space (South Africa)

    Kumile, CM

    2007-08-01

    …of products in unpredictable quantities. Computer Integrated Manufacturing (CIM) systems play an important role in integrating such flexible systems. This paper presents a methodology for increasing the flexibility and reusability of a generic CIM cell...

  11. Integrated Computational System for Electrochemical Device Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Illinois Rocstar LLC proposes to develop and demonstrate the use of an integrated computational environment and infrastructure for electrochemical device design and...

  12. A methodology for assessing computer software applicability to inventory and facility management

    OpenAIRE

    Paul, Debashis

    1989-01-01

    Computer applications have become popular and widespread in architecture and other related fields. While the architect uses a computer for the design and construction of a building, the user takes advantage of the computer for maintenance of the building. Inventory and facility management are two such fields where computer applications have become predominant. The project investigated the use and application of different commercially available computer software in the above men...

  13. Integrated research of parallel computing: Status and future

    Institute of Scientific and Technical Information of China (English)

    CHEN GuoLiang; SUN GuangZhong; XU Yun; LONG Bai

    2009-01-01

    In the past twenty years, the research group at the University of Science and Technology of China has developed an integrated research method for parallel computing, which is a combination of "Architecture-Algorithm-Programming-Application". This method is also called the ecological environment of parallel computing research. In this paper, we survey the current status of the integrated research method for parallel computing and, considering the impact of multi-core systems, cloud computing and personal high performance computers, we present our outlook on the future development of parallel computing.

  14. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  15. Oxy-Combustion Burner and Integrated Pollutant Removal Research and Development Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Mark Schoenfield; Manny Menendez; Thomas Ochs; Rigel Woodside; Danylo Oryshchyn

    2012-09-30

    A high flame temperature oxy-combustion test facility, consisting of a 5 MWe equivalent test boiler facility and a 20 kWe equivalent IPR®, was constructed at the Hammond, Indiana manufacturing site. The test facility was operated on natural gas and coal fuels, and parametric studies were performed to determine the optimal performance conditions and to generate the technical data required to demonstrate that the technologies are viable for technical and economic scale-up. Flame temperatures between 4930°F and 6120°F were achieved with high flame temperature oxy-natural gas combustion, depending on whether additional recirculated flue gases were added to balance the heat transfer. For high flame temperature oxy-coal combustion, flame temperatures in excess of 4500°F were achieved and demonstrated to be consistent with computational fluid dynamic modeling of the burner system. The project demonstrated the feasibility and effectiveness of the Jupiter Oxygen high flame temperature oxy-combustion process with the Integrated Pollutant Removal process for CCS and CCUS. With these technologies, total parasitic power requirements for both oxygen production and carbon capture are currently in the range of 20% of gross power output. The Jupiter Oxygen high flame temperature oxy-combustion process has been demonstrated at a Technology Readiness Level of 6 and is ready for commencement of a demonstration project.

  16. Computer-Based Integrated Learning Systems: Research and Theory.

    Science.gov (United States)

    Hativa, Nira, Ed.; Becker, Henry Jay, Ed.

    1994-01-01

    The eight chapters of this theme issue discuss recent research and theory concerning computer-based integrated learning systems. Following an introduction about their theoretical background and current use in schools, the effects of using computer-based integrated learning systems in the elementary school classroom are considered. (SLD)

  17. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria

    2016-01-01

    AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, the integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, the unified storage protocol declarations required for PanDA Pilot site movers, and others.

  18. The Overview of the National Ignition Facility Distributed Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L J; Bettenhausen, R C; Carey, R A; Estes, C M; Fisher, J M; Krammen, J E; Reed, R K; VanArsdall, P J; Woodruff, J P

    2001-10-15

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates, respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer is complemented by another segment comprising an additional 14,000 control points for industrial controls including vacuum, argon, synthetic air, and safety interlocks, implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented with asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding, using a mixed-language environment of Ada95 and Java, is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008.
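
    The supervisor/front-end split described above can be caricatured in a few lines; the sketch below is illustrative Python with invented names (the real ICCS is Ada and Java over CORBA):

        # Schematic of a supervisor fanning a command out to front-end processors.
        class FrontEndProcessor:
            def __init__(self, name, control_points):
                self.name = name
                self.control_points = control_points   # e.g. motors, digitizers

            def apply(self, setting):
                # A real FEP would drive hardware; here we just echo the setting.
                return {cp: setting for cp in self.control_points}

        class Supervisor:
            def __init__(self, feps):
                self.feps = feps

            def broadcast(self, setting):
                # In ICCS this dispatch crosses the CORBA distribution layer.
                return {fep.name: fep.apply(setting) for fep in self.feps}

        sup = Supervisor([FrontEndProcessor("align-01", ["motor_x", "motor_y"])])
        print(sup.broadcast("home"))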

  19. [Use of personal computers in forensic medicine facilities].

    Science.gov (United States)

    Vorel, F

    1995-08-01

    The authors present a brief account of the possibilities for using PC-type computers in departments of forensic medicine and discuss basic hardware and software equipment. In the author's opinion, the main reason for using computers is to create an extensive database of post-mortem findings which would make it possible to process them on a large scale and use them for research and prevention. The introduction of computers depends on the management of the department, and it is necessary to persuade the staff, the future users of the computers, of the advantages associated with their use.

  20. Computer science in Dutch secondary education: independent or integrated?

    NARCIS (Netherlands)

    Sijde, van der Pieter C.; Doornekamp, B. Gerard

    1992-01-01

    Nowadays, in Dutch secondary education, computer science is integrated within school subjects. About ten years ago computer science was considered an independent subject, but in the mid-1980s this idea changed. In our study we investigated whether the objectives of teaching computer science as an in

  1. Computer science in Dutch secondary education: independent or integrated?

    NARCIS (Netherlands)

    van der Sijde, Peter; Doornekamp, B.G.

    1992-01-01

    Nowadays, in Dutch secondary education, computer science is integrated within school subjects. About ten years ago computer science was considered an independent subject, but in the mid-1980s this idea changed. In our study we investigated whether the objectives of teaching computer science as an

  2. Integrating Computer Ethics across the Curriculum: A Case Study

    Science.gov (United States)

    Ben-Jacob, Marion G.

    2005-01-01

    There is an increased use of computers in the educational environment of today that compels educators and learners to be informed about computer ethics and the related social and legal issues. This paper addresses different approaches for integrating computer ethics across the curriculum. Included are ideas for online and on-site workshops, the…

  3. Integral Test Facility PKL: Experimental PWR Accident Investigation

    OpenAIRE

    2012-01-01

    Investigations of the thermal-hydraulic behavior of pressurized water reactors under accident conditions have been carried out in the PKL test facility at AREVA NP in Erlangen, Germany for many years. The PKL facility models the entire primary side and significant parts of the secondary side of a pressurized water reactor (PWR) at a height scale of 1 : 1. Volumes, power ratings and mass flows are scaled with a ratio of 1 : 145. The experimental facility consists of 4 primary loops with circul...

  4. Final technical report for DOE Computational Nanoscience Project: Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, P. T.

    2010-02-08

    This document reports the outcomes of the Computational Nanoscience Project, "Integrated Multiscale Modeling of Molecular Computing Devices". It includes a list of participants and publications arising from the research supported.

  5. Language Facilities for Programming User-Computer Dialogues.

    Science.gov (United States)

    Lafuente, J. M.; Gries, D.

    1978-01-01

    Proposes extensions to PASCAL that provide for programming man-computer dialogues. An interactive dialogue application program is viewed as a sequence of frames and separate computational steps. The PASCAL extensions allow the description of the items of information in each frame and the inclusion of behavior rules specifying the interactive dialogue.…
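
    The frame/step decomposition can be illustrated with a toy dialogue loop; the Python below is a stand-in for the paper's PASCAL extensions, with invented names:

        # Toy frame-based dialogue: each frame pairs a displayed item with a rule.
        frames = [
            {"prompt": "Enter part number:", "rule": str.isdigit},
            {"prompt": "Enter quantity:",    "rule": str.isdigit},
        ]

        def run_dialogue(frames, answers):
            """Walk the frame sequence, applying each frame's behavior rule."""
            for frame, answer in zip(frames, answers):
                if not frame["rule"](answer):
                    return "rejected at: " + frame["prompt"]
            return "dialogue complete"

        print(run_dialogue(frames, ["1042", "7"]))   # -> dialogue complete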

  6. Integrated Science through Computer-aided Experiments

    Directory of Open Access Journals (Sweden)

    Stanislav HOLEC

    2004-10-01

    The paper outlines curriculum development activities in science education in the Slovak Republic, carried out through international collaboration within the Leonardo da Vinci II pilot project Computerised Laboratory in Science and Technology Teaching ("ComLab-SciTech"). The teaching and learning materials created involve integration of science curricula in two senses: integration of the knowledge and methodology of physics, chemistry and biology, and integration of various real and virtual computerised experimental methods. The materials contain suggestions for student investigative activities in which life science processes are studied with the use of laboratory models.

  7. Sugarcane agricultural-industrial facilities and greenhouses integration

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Andres da [Estufas Agricolas Comercio e Assessoria Ltda. (EACEA), SP (Brazil)

    2012-07-01

    This chapter addresses the Brazilian greenhouse market and technology, food market trends, the integration of bioethanol distilleries with greenhouse (GH) production, recovery of CO2 from the fermentation process, recovery of low-temperature energy, the use of vinasse and bagasse in GH processes, examples of integrated GHs around the world, a tomato integrated-GH case study, and a business model.

  8. Integrated computational and conceptual solutions for complex environmental information management

    Science.gov (United States)

    Rückemann, Claus-Peter

    2016-06-01

    This paper presents recent results of the integration of computational and conceptual solutions for the complex case of environmental information management. The major goal of creating and developing long-term multi-disciplinary knowledge resources with conceptual and computational support was achieved by implementing and integrating key components: long-term knowledge resources providing the structures required for universal knowledge creation, documentation, and preservation; universal multi-disciplinary and multi-lingual conceptual knowledge and classification, in particular references to the Universal Decimal Classification (UDC); sustainable workflows for environmental information management; and computational support for dynamical use, processing, and advanced scientific computing with Integrated Information and Computing System (IICS) components and High End Computing (HEC) resources.

  9. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb uses its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...
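
    The "multiple interfaces" point lends itself to a short sketch: one abstract driver per cloud API lets a single scheduler instantiate virtual machines on any supported infrastructure. The classes below are hypothetical and do not reproduce VMDIRAC's actual drivers:

        # Hypothetical multi-cloud dispatch via an abstract driver interface.
        from abc import ABC, abstractmethod

        class CloudDriver(ABC):
            @abstractmethod
            def start_vm(self, image: str) -> str: ...

        class OpenStackDriver(CloudDriver):
            def start_vm(self, image: str) -> str:
                return "openstack-vm(" + image + ")"   # real driver calls the API

        class EC2Driver(CloudDriver):
            def start_vm(self, image: str) -> str:
                return "ec2-instance(" + image + ")"   # real driver calls the API

        def provision(drivers, image):
            """Start one VM of the given image on every configured cloud."""
            return [d.start_vm(image) for d in drivers]

        print(provision([OpenStackDriver(), EC2Driver()], "lhcb-worker"))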

  10. Mechatronic sensory system for computer integrated manufacturing

    CSIR Research Space (South Africa)

    Kumile, CM

    2007-05-01

    …(CIM) systems play an important role in integrating such flexible systems. The requirement for fast and cheap design and redesign of manufacturing systems is therefore gaining in importance, considering not only the products and the physical...

  11. Numerical methods of computation of singular and hypersingular integrals

    Directory of Open Access Journals (Sweden)

    I. V. Boikov

    2001-01-01

    In many problems of science and technology one is faced with the necessity of calculating various singular integrals. Calculation of singular integrals in analytical form is possible only in exceptional cases. Therefore approximate methods for calculating singular integrals are an actively developing area of computational mathematics. This review is devoted to algorithms, optimal with respect to accuracy, for the calculation of singular integrals with fixed singularities, Cauchy and Hilbert kernels, and polysingular and many-dimensional singular integrals. A separate section is devoted to accuracy-optimal algorithms for the calculation of hypersingular integrals.
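
    One classical technique in this family, singularity subtraction for Cauchy-type integrals, is easy to demonstrate (a minimal sketch, not an algorithm from the review itself):

        # Principal value of  PV ∫_a^b f(x)/(x - c) dx,  a < c < b, by subtracting
        # the singularity: (f(x) - f(c))/(x - c) is bounded, and the removed part
        # integrates in closed form to f(c) * ln((b - c)/(c - a)).
        import numpy as np
        from scipy.integrate import quad

        def cauchy_pv(f, c, a=-1.0, b=1.0):
            fc = f(c)
            def regular(x):
                if x == c:          # single point; any finite value is harmless
                    return 0.0
                return (f(x) - fc) / (x - c)
            smooth, _ = quad(regular, a, b, points=[c])
            return smooth + fc * np.log((b - c) / (c - a))

        # Example: PV ∫_{-1}^{1} e^x / (x - 0.3) dx
        print(cauchy_pv(np.exp, 0.3))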

  12. Computation of overlap integrals over STOs with mathematica

    Science.gov (United States)

    Yükçü, S. A.; Yükçü, N.

    2017-02-01

    Overlap integrals, which are encountered in molecular structure calculations, are the most basic molecular integrals; other molecular integrals can be expressed in terms of them. Overlap integrals can be calculated using Slater Type Orbitals (STOs). In this work, we develop algorithms for two-center overlap integrals, calculated over STOs in ellipsoidal coordinates with auxiliary functions due to S. M. Mekelleche's group. The Mathematica programming language was used to implement the algorithms. Numerical results for some quantum numbers are presented in tables. Finally, our numerical results are compared with others, and some details of the evaluation method are discussed.
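
    For reference, the quantity in question is the two-center overlap between normalized STOs; the definitions below are the standard ones, not transcribed from the paper:

        S_{ab} = \int \chi_a^{*}(\mathbf{r}-\mathbf{R}_A)\,\chi_b(\mathbf{r}-\mathbf{R}_B)\,d^3r,
        \qquad
        \chi_{n\zeta lm}(\mathbf{r}) = \frac{(2\zeta)^{n+1/2}}{\sqrt{(2n)!}}\; r^{\,n-1} e^{-\zeta r}\, Y_{lm}(\theta,\varphi)

    Transforming to prolate spheroidal (ellipsoidal) coordinates centered on the two nuclei reduces S_{ab} to products of one-dimensional auxiliary integrals, which is the route automated in Mathematica here.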

  13. National electronic medical records integration on cloud computing system.

    Science.gov (United States)

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption. Others have a low level, and most have no EMR at all. Cloud computing is a new emerging technology that has been used in other industries with great success. Despite the great features of Cloud computing, it has not yet been widely utilized in the healthcare industry. This study presents an innovative Healthcare Cloud Computing system for integrating Electronic Health Records (EHR). The proposed system applies Cloud Computing technology to the EHR system, to present a comprehensive integrated EHR environment.

  14. Fundamentals of power integrity for computer platforms and systems

    CERN Document Server

    DiBene, Joseph T

    2014-01-01

    An all-encompassing text that focuses on the fundamentals of power integrity. Power integrity is the study of power distribution from the source to the load and the system-level issues that can occur across it. For computer systems, these issues can range from inside the silicon to across the board and may egress into other parts of the platform, including thermal, EMI, and mechanical. With a focus on computer systems and silicon-level power delivery, this book sheds light on the fundamentals of power integrity, utilizing the author's extensive background in the power integrity industry and un

  15. Equitable provision of social facilities for a range of settlements: guidelines and tools for integrated provision

    CSIR Research Space (South Africa)

    Green, Cheri A

    2013-09-01

    This work looks at equitable provision of social facilities for a range of settlements and offers guidelines and tools for integrated provision that incorporate the 1) development of fully provisioned quality living environments, 2) improvement of access...

  16. Performance of simulated flexible integrated gasification polygeneration facilities. Part A: A technical-energetic assessment

    NARCIS (Netherlands)

    Meerman, J.C.; Ramírez Ramírez, C.A.; Turkenburg, W.C.; Faaij, A.P.C.

    2011-01-01

    This article investigates the technical possibilities and performance of flexible integrated gasification polygeneration (IG-PG) facilities equipped with CO2 capture for the near future. These facilities can produce electricity during peak hours, while switching to the production of chemicals during off-peak hours.

  17. Integrating Sustainability Programs into the Facilities Capital Planning Process

    Science.gov (United States)

    Buchanan, Susan

    2011-01-01

    With detailed information about the costs and benefits of potential green investments, educational facilities can effectively evaluate which initiatives will ultimately provide the greatest results over the short and long term. Based on its overall goals, every school, college, or university will have different values and therefore different…

  18. Integral Test Facility PKL: Experimental PWR Accident Investigation

    Directory of Open Access Journals (Sweden)

    Klaus Umminger

    2012-01-01

    Investigations of the thermal-hydraulic behavior of pressurized water reactors under accident conditions have been carried out in the PKL test facility at AREVA NP in Erlangen, Germany for many years. The PKL facility models the entire primary side and significant parts of the secondary side of a pressurized water reactor (PWR) at a height scale of 1 : 1. Volumes, power ratings and mass flows are scaled with a ratio of 1 : 145. The experimental facility consists of 4 primary loops with circulation pumps and steam generators (SGs) arranged symmetrically around the reactor pressure vessel (RPV). The investigations carried out encompass a very broad spectrum from accident scenario simulations with large, medium, and small breaks, over the investigation of shutdown procedures after a wide variety of accidents, to the systematic investigation of complex thermal-hydraulic phenomena. This paper presents a survey of test objectives and programs carried out to date. It also describes the test facility in its present state. Some important results obtained over the years, with a focus on investigations carried out since the beginning of the international cooperation, are discussed as examples.

  19. Field Research Facility Data Integration Framework Data Management Plan: Survey Lines Dataset

    Science.gov (United States)

    2016-08-01

    clearinghouse tool using the Environmental Systems Research Institute (Esri) Geoportal technology. Once the XML metadata is loaded into the Metadata Manager... (Report ERDC/CHL SR-16-4, Coastal Ocean Data Systems Program, August 2016: Field Research Facility Data Integration Framework Data Management Plan, Survey Lines Dataset, Michael F...)

  20. Paradox of integration -- a computational model

    CERN Document Server

    Krawczyk, Malgorzata J

    2016-01-01

    The paradoxical aspect of integration of a social group has been highlighted by Peter Blau (Exchange and Power in Social Life, Wiley and Sons, 1964). During the integration process, the group members simultaneously compete for social status and play the role of the audience. Here we show that when the competition prevails over the desire for approval, a sharp transition breaks all friendly relations. However, as was described by Blau, people with high status are inclined to bother more with acceptance by others; this is achieved by praising others and revealing their own weak points. In our model, this action smooths the transition and improves interpersonal relations.

  1. Paradox of integration-A computational model

    Science.gov (United States)

    Krawczyk, Małgorzata J.; Kułakowski, Krzysztof

    2017-02-01

    The paradoxical aspect of integration of a social group has been highlighted by Blau (1964). During the integration process, the group members simultaneously compete for social status and play the role of the audience. Here we show that when the competition prevails over the desire for approval, a sharp transition breaks all friendly relations. However, as was described by Blau, people with high status are inclined to bother more with acceptance by others; this is achieved by praising others and revealing their own weak points. In our model, this action smooths the transition and improves interpersonal relations.

  2. Integrating Computer Literacy into the Vocational Curriculum.

    Science.gov (United States)

    Whatley, Myra N.

    In the summer of 1983, a centralized microcomputer laboratory was established on the Sarasota County Vocational Technical Center campus. Computers (IBM PCs, APPLE IIes, and TRS-80s) were installed and the teacher who had written the program proposal was chosen to teach the course of instruction to be offered by the laboratory. When classes began…

  3. Integrating Computer Concepts into Principles of Accounting.

    Science.gov (United States)

    Beck, Henry J.; Parrish, Roy James, Jr.

    A package of instructional materials for an undergraduate principles of accounting course at Danville Community College was developed based upon the following assumptions: (1) the principles of accounting student does not need to be able to write computer programs; (2) computerized accounting concepts should be presented in this course; (3)…

  4. Enhancing Mathematics Education through Computer Integration.

    Science.gov (United States)

    Friedman, Edward A., And Others

    1991-01-01

    The director of the three-year-old Stevens project, which addresses the implementation and use of computers for middle and secondary mathematics instruction, offers a progress report. Time for preparation, teacher training, and defining the changing role of the teacher are cited as key variables. Future activities are also previewed. (Author/JJK)

  5. Integrated computer control system architectural overview

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P.

    1997-06-18

    This overview introduces the NIF Integrated Computer Control System (ICCS) architecture. The design is abstract to allow the construction of many similar applications from a common framework. This summary lays the essential foundation for understanding the model-based engineering approach used to execute the design.

  6. Object-oriented models of functionally integrated computer systems

    OpenAIRE

    Kaasbøll, Jens

    1994-01-01

    Functional integration is the compatibility between the structure, culture and competence of an organization and its computer systems, specifically the availability of data and functionality and the consistency of user interfaces. Many people use more than one computer program in their work, and they experience problems relating to functional integration. Various solutions can be considered for different tasks and technology; e.g. to design a common user-interface shell for several application...

  7. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  8. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  9. [The integration of disabled children in childcare facilities].

    Science.gov (United States)

    Tocqueville, Mélanie

    2011-01-01

    The integration of young disabled children into early childhood care raises questions for our society, both regarding the place given to disability and with regard to parenting. It is urgent to reflect on the training of childcare professionals and on the need to link theory and practice, to ensure everyone participates in this integration.

  10. National Ignition Facility sub-system design requirements integrated safety systems SSDR 1.5.4

    Energy Technology Data Exchange (ETDEWEB)

    Reed, R.; VanArsdall, P.; Bliss, E.

    1996-09-01

    This System Design Requirement document establishes the performance, design, development, and test requirements for the Integrated Safety System, which is part of the NIF Integrated Computer Control System (ICCS).

  11. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  12. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  13. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced in the paper will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics) tools, UG, CATIA, and so on.

  14. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to freeform curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  15. Integrating Network Management for Cloud Computing Services

    Science.gov (United States)

    2015-06-01

  16. Integrated computer-aided retinal photocoagulation system

    Science.gov (United States)

    Barrett, Steven F.; Wright, Cameron H. G.; Oberg, Erik D.; Rockwell, Benjamin A.; Cain, Clarence P.; Jerath, Maya R.; Rylander, Henry G., III; Welch, Ashley J.

    1996-05-01

    Successful retinal tracking subsystem testing results in vivo on rhesus monkeys using an argon continuous wave laser and an ultra-short pulse laser are presented. Progress on developing an integrated robotic retinal laser surgery system is also presented. Several interesting areas of study have developed: (1) 'doughnut' shaped lesions that occur under certain combinations of laser power, spot size, and irradiation time complicating measurements of central lesion reflectance, (2) the optimal retinal field of view to achieve simultaneous tracking and lesion parameter control, and (3) a fully digital versus a hybrid analog/digital tracker using confocal reflectometry integrated system implementation. These areas are investigated in detail in this paper. The hybrid system warrants a separate presentation and appears in another paper at this conference.

  17. Integrating Mobile Robotics and Vision with Undergraduate Computer Science

    Science.gov (United States)

    Cielniak, G.; Bellotto, N.; Duckett, T.

    2013-01-01

    This paper describes the integration of robotics education into an undergraduate Computer Science curriculum. The proposed approach delivers mobile robotics as well as covering the closely related field of Computer Vision and is directly linked to the research conducted at the authors' institution. The paper describes the most relevant…

  19. Computer integration in the curriculum: promises and problems

    NARCIS (Netherlands)

    Plomp, Tjeerd; Akker, van den Jan J.H.

    1988-01-01

    This discussion of the integration of computers into the curriculum begins by reviewing the results of several surveys conducted in the Netherlands and the United States which provide insight into the problems encountered by schools and teachers when introducing computers in education. Case studies

  20. Integration of computers in education: a curriculum perspective

    NARCIS (Netherlands)

    Plomp, Tjeerd

    1988-01-01

    This discussion of a major problem area in education--the curricular and implementation aspects of the application of the computer or new information technologies--focuses first on the use and integration of computers in existing courses or subjects in the curriculum, and defines some key terms. The

  1. An Integrated and Layered Architecture for Location-Aware Computing

    Institute of Scientific and Technical Information of China (English)

    MA Linbing; ZHANG Xinchang; TAO Haiyan

    2005-01-01

    This paper gives an overall introduction to the basic concept of LAC (location-aware computing) and its development status, and puts forward an integrated location-aware computing architecture that is useful for designing a reasonable logical model of LBS (location-based service). Finally, a brief introduction is given to a LAC experimental prototype, which acts as a mobile urban tourism assistant.

  2. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration

    2017-01-01

    AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and the recent developments of AGIS functionalities.

  3. Computation of Surface Integrals of Curl Vector Fields

    Science.gov (United States)

    Hu, Chenglie

    2007-01-01

    This article presents a way of computing a surface integral when the vector field of the integrand is a curl field. Presented in some advanced calculus textbooks such as [1], the technique, as the author experienced, is simple and applicable. The computation is based on Stokes' theorem in 3-space calculus, and thus provides not only a means to…
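
    The underlying identity is Stokes' theorem, which trades the surface integral of a curl for a line integral around the boundary curve (standard statement, quoted for orientation):

        \iint_S (\nabla \times \mathbf{F}) \cdot d\mathbf{S} = \oint_{\partial S} \mathbf{F} \cdot d\mathbf{r}

    Only the behavior of \mathbf{F} on the boundary \partial S matters, so the surface may be replaced by any other surface sharing that boundary, which is what makes the technique "simple and applicable".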

  4. Integrating Computational Chemistry into a Course in Classical Thermodynamics

    Science.gov (United States)

    Martini, Sheridan R.; Hartzell, Cynthia J.

    2015-01-01

    Computational chemistry is commonly addressed in the quantum mechanics course of undergraduate physical chemistry curricula. Since quantum mechanics traditionally follows the thermodynamics course, there is a lack of curricula relating computational chemistry to thermodynamics. A method integrating molecular modeling software into a semester long…

  6. An algorithm of computing inhomogeneous differential equations for definite integrals

    OpenAIRE

    Nakayama, Hiromasa; Nishiyama, Kenta

    2010-01-01

    We give an algorithm to compute inhomogeneous differential equations for definite integrals with parameters. The algorithm is based on the integration algorithm for D-modules by Oaku. The main tool in the algorithm is the Gröbner basis method in the ring of differential operators.
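
    A toy instance of the kind of relation such an algorithm produces (constructed here for illustration; it is not an example from the paper): the parametric integral I(t) = \int_0^1 e^{tx}\,dx satisfies the inhomogeneous equation

        t\,I'(t) + I(t) = e^{t}

    where the inhomogeneity e^{t} is the boundary term at x = 1 generated by integration by parts; such boundary contributions are precisely what make differential equations for definite integrals inhomogeneous.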

  7. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia;

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  8. A GIS Based Integrated Approach to Measure the Spatial Equity of Community Facilities of Bangladesh

    Directory of Open Access Journals (Sweden)

    Meher Nigar Neema

    2015-12-01

    The distribution of public facilities and their spatial equity is an important matter to be considered while planning public facilities. However, most studies in the literature have considered only a single type of facility, leaving other facilities unexamined. In this paper an integrated spatial index for public facilities has been developed by combining GIS and spatial analysis models. The index measures spatial equity based on the accessibility of 6 different types of public facilities for 5247 unions and 476 sub-districts of Bangladesh. Spatial autocorrelation techniques have been applied to understand the spatial pattern of accessibility; this helps to characterize spatial equity at both disaggregated and aggregated levels. It has been found that variation in accessibility to the facilities across space is significant. The distribution of some facilities is spatially clustered in particular areas, meaning those areas are in an advantageous position in terms of accessibility while other areas remain in a backward condition. The proposed index and the spatial autocorrelation analysis will help to identify which areas should receive more priority in allocating particular types of public facilities in the future.
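
    The abstract does not name the autocorrelation statistic used; the most common choice for this kind of clustering analysis is Moran's I (standard definition, quoted for orientation):

        I = \frac{n}{\sum_i \sum_j w_{ij}} \cdot
            \frac{\sum_i \sum_j w_{ij}\,(x_i - \bar{x})(x_j - \bar{x})}{\sum_i (x_i - \bar{x})^2}

    where x_i is the accessibility score of area i, \bar{x} the mean score, and w_{ij} a spatial weight between areas i and j; values near +1 indicate clustering of similar accessibility levels, consistent with the pattern reported above.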

  9. Peta-scale QMC simulations on DOE leadership computing facilities

    Science.gov (United States)

    Kim, Jeongnim; Ab Initio Network Collaboration

    2014-03-01

    Continuum quantum Monte Carlo (QMC) has proved to be an invaluable tool for predicting the properties of matter from fundamental principles. Even with numerous innovations in methods, algorithms and codes, QMC simulations of realistic problems with thousands of electrons and more are demanding, requiring millions of core hours to achieve the target chemical accuracy. The multiple forms of parallelism afforded by QMC algorithms and their high compute-to-communication ratio make them ideal candidates for acceleration in the multi/many-core paradigm. We have ported and tuned QMCPACK to the recently deployed DOE deca-petaflop systems, Titan (Cray XK7 CPU/GPGPU) and Mira (IBM Blue Gene/Q). The efficiency gains through improved algorithms and architecture-specific tuning and, most importantly, the vast increase in computing power have opened up opportunities to apply QMC at unprecedented scales, accuracy and time-to-solution. We present large-scale QMC simulations to study the energetics of layered materials where vdW interactions play critical roles. Collaboration supported through the Predictive Theory and Modeling for Materials and Chemical Science program by Basic Energy Sciences, Department of Energy.
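
    As a toy illustration of the variational flavour of QMC (a minimal sketch of my own; production codes such as QMCPACK are vastly more sophisticated), the following estimates the ground-state energy of a 1D harmonic oscillator with a Gaussian trial wavefunction via Metropolis sampling.

```python
# Minimal variational Monte Carlo sketch (illustration only).
# H = -1/2 d^2/dx^2 + 1/2 x^2, trial wavefunction psi(x) = exp(-alpha x^2),
# local energy E_L(x) = alpha + x^2 (1/2 - 2 alpha^2).
import math, random

def vmc_energy(alpha, n_steps=50000, step=1.0, seed=1):
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis test on |psi|^2 = exp(-2 alpha x^2)
        if rng.random() < math.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        e_sum += alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    return e_sum / n_steps

print(vmc_energy(0.4))  # above the exact ground-state energy 0.5
print(vmc_energy(0.5))  # exactly 0.5: E_L is constant at the optimum
```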

  10. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and
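
    The coarray work above is in Fortran; as a language-neutral sketch of the same overlap pattern (my illustration using non-blocking MPI via the mpi4py library, not the IFS implementation), the snippet below posts a halo exchange, computes on interior points while messages are in flight, and only then touches the boundary.

```python
# Sketch of computation/communication overlap with non-blocking MPI
# (analogous in spirit to Fortran 2008 coarrays; not the IFS code).
# Run with e.g.: mpiexec -n 4 python halo_overlap.py  (hypothetical name)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

u = np.full(10, float(rank))   # this rank's slab of a periodic 1D field
halo = np.empty(2)             # halo[0] from left, halo[1] from right

# Post the halo exchange first ...
reqs = [comm.Irecv(halo[0:1], source=left,  tag=0),
        comm.Irecv(halo[1:2], source=right, tag=1),
        comm.Isend(u[0:1],    dest=left,    tag=1),
        comm.Isend(u[-1:],    dest=right,   tag=0)]

# ... compute on interior points while messages are in flight ...
interior = 0.5 * (u[:-2] + u[2:])

# ... and only wait where the halo values are actually needed.
MPI.Request.Waitall(reqs)
edges = (0.5 * (halo[0] + u[1]), 0.5 * (u[-2] + halo[1]))
```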

  11. Computer Aided Engineering of Semiconductor Integrated Circuits

    Science.gov (United States)

    1976-04-01

    transistor operation; (4) theoretical investigations of carrier mobility in the inversion layer of an MOSFET; (5) mathematical investigations for high... satisfactory agreement with experiment. In time, the rapid growth of semiconductor integrated circuit (IC) technology created ... for which this theory was... and Technology of Semiconductor Devices, John Wiley and Sons, Inc., N.Y. (1967). [2] S. K. Ghandi, The Theory and Practice of

  12. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Peisert, Sean [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Davis, CA (United States); Potok, Thomas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jones, Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-03

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and roadmaps for future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the

  13. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for the PSE in the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex systems considering properties of human beings'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using the prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  14. Integration of criticality alarm system at a fuel manufacturing facility

    Energy Technology Data Exchange (ETDEWEB)

    Longinov, M.; Pant, A. [Zircatec Precision Industries, Port Hope, Ontario (Canada)

    2005-07-01

    In response to the Power Uprate program at Bruce Power, Zircatec has committed to introduce, by Spring 2006, a new manufacturing line for the production of 43 element CANFLEX bundles containing Slightly Enriched Uranium (SEU) with a centre pin of blended dysprosia/urania (BDU). This is a new fuel design and is the first change in fuel design since the introduction of the current 37 element fuel over 20 years ago. As the primary fuel supplier to the reactor site that has chosen to utilize this new fuel design, Zircatec has agreed to manufacture and supply this new fuel at their facility in Port Hope, Ontario. Under this agreement, Zircatec is challenged with converting a fuel manufacturing facility to include the processing of enriched uranium. The challenge is to introduce the new concept of criticality control to a facility that traditionally has not had to deal with such a concept. One of the elements of the implementation is the criticality incident detection and alarm system - CIDAS. Since a criticality could go undetected by human senses, one of the methods of ensuring safety from radiation exposure in the event of a criticality is the installation of a criticality incident detection and alarm system. This early warning device could be the difference between low dose exposure and lethal exposure. This paper describes the challenges that Zircatec has faced with the installation of a criticality incident detection and alarm system. These challenges include determining the needs and requirements, determining appropriate specifications, selecting the right equipment, installing the equipment and training personnel in the operation of the new equipment. (author)

  15. Integral Monitored Retrievable Storage (MRS) Facility conceptual design report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1985-09-01

    In April 1985, the Department of Energy (DOE) selected the Clinch River site as its preferred site for the construction and operation of the monitored retrievable storage (MRS) facility (USDOE, 1985). In support of the DOE MRS conceptual design activity, available data describing the site have been gathered and analyzed. A composite geotechnical description of the Clinch River site has been developed and is presented herein. This report presents Clinch River site description data in the following sections: general site description, surface hydrologic characteristics, groundwater characteristics, geologic characteristics, vibratory ground motion, surface faulting, stability of subsurface materials, slope stability, and references. 48 refs., 35 figs., 6 tabs.

  16. On Integrating Computer Technology into Our Curriculum

    Institute of Scientific and Technical Information of China (English)

    Jianhua Bai

    2000-01-01

    1. Introduction When something new and exciting occurs in technology, some of us foreign language teachers tend to become too excited to notice the limitations of the new technology. It is no exception with the current use of multimedia and internet technology in our curriculum. It is my belief that, without an adequate understanding of its limitations as well as its advantages, we may find ourselves disappointed as we move ahead in integrating the multimedia technology into our instruction in a meaningful way. Therefore, in this paper, I would like to start with the discussion of the limitations of multimedia technology and then discuss how we can make use of this new technology to improve our teaching and learning of Chinese as a foreign language.

  17. Engineering Task Plan for the Integrity Assessment Examination of Double Contained Receiver Tanks (DCRT) Catch Tanks and Ancillary facilities

    Energy Technology Data Exchange (ETDEWEB)

    BECKER, D.L.

    2000-05-23

    This Engineering Task Plan (ETP) presents the integrity assessment examination of three DCRTs, seven catch tanks, and two ancillary facilities located in the 200 East and West Areas of the Hanford Site. The integrity assessment examinations, as described in this ETP, will provide the necessary information to enable the independently qualified registered professional engineer (IQRPE) to assess the condition and integrity of these facilities. The plan is consistent with the Double-Shell Tank Waste Transfer Facilities Integrity Assessment Plan.

  18. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN Computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at: http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  19. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Full Text Available Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al., 1996).

  20. DARHT: integration of shielding design and analysis with facility design

    Energy Technology Data Exchange (ETDEWEB)

    Boudrie, R. L. (Richard L.); Brown, T. H. (Thomas H.); Gilmore, W. E. (Walter E.); Downing, J. N. (James N.), Jr.; Hack, Alan; McClure, D. A. (Donald A.); Nelson, C. A. (Christine A.); Wadlinger, E. Alan; Zumbro, M. V. (Martha V.)

    2002-01-01

    The design of the interior portions of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility incorporated shielding and controls from the beginning of the installation of the Accelerators. The purpose of the design and analysis was to demonstrate the adequacy of shielding or to determine the need for additional shielding or controls. Two classes of events were considered: (1) routine operation defined as the annual production of 10,000 2000-ns pulses of electrons at a nominal energy of 20 MeV, some of which are converted to the x-ray imaging beam consisting of four nominal 60-ns pulses over the 2000-ns time frame, and (2) accident case defined as up to 100 2000-ns pulses of electrons accidentally impinging on some metallic surface, thereby producing x rays. Several locations for both classes of events were considered inside and outside of the accelerator hall buildings. The analysis method consisted of the definition of a source term for each case studied and the definition of a model of the shielding and equipment present between the source and the dose areas. A minimal model of the fixed existing or proposed shielding and equipment structures was used for a first approximation. If the resulting dose from the first approximation was below the design goal (1 rem/yr for routine operations, 5 rem for accident cases), then no further investigations were performed. If the result of the first approximation was above our design goals, the model was refined to include existing or proposed shielding and equipment. In some cases existing shielding and equipment were adequate to meet our goals and in some cases additional shielding was added or administrative controls were imposed to protect the workers. It is expected that the radiation shielding design, exclusion area designations, and access control features will result in low doses to personnel at the DARHT Facility.

  1. Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Jerzy Bernholc

    2011-02-03

    Nanoscience has been one of the major research focuses of the U.S. and much of the world for the past decade, in part because of its promise to revolutionize many fields, including materials, medicine, and electronics. At the heart of this promise is the fact that nanostructured materials can behave radically differently than their macroscopic counterparts (e.g., bulk gold is such an inert metal that it has found applications in such diverse fields as jewelry, biomedical implants and dentistry, whereas gold nanoparticles are highly reactive and are thus useful as nanocatalysts) and have properties that are tunable due to a strong dependence on the size and surface area of the nanostructure. Thus, nanoscience offers a remarkable opportunity to develop new functional systems built around nanostructured materials with unusual and tunable properties and functionality. The transition from nanoscience to nanotechnology becomes possible when nanostructured systems can be made reproducibly by processes that can be implemented on a large scale. The microelectronics industry is one example of an industry that has evolved into the realm of nanotechnology, since the exponential reduction in feature size in computer chips has resulted in feature sizes now under 50nm (45nm in production, 32nm demonstrated; feature size has been going down by a factor of approximately 1/√2 every 18 months as chip density has doubled every 18 months according to Moore's law). Silicon-based microelectronics relies on etching features into a single-crystal silicon substrate by photolithography. As the feature size of silicon-based microelectronics continues to decrease, the continuation of Moore's law to below 20nm feature sizes is being questioned, due to limitations in both the physics of the transistors (leading to unacceptable power dissipation) and doubts about the scalability of top-down photolithography-based manufacturing to such small sizes. There is no doubt that
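
    The 1/√2 figure quoted above follows from simple area arithmetic; here is the one-line derivation (standard reasoning, added for clarity).

```latex
% Why feature size shrinks by ~1/sqrt(2) when chip density doubles:
% with N transistors on a fixed die area A, the area per transistor is
% A/N, and the linear feature size L scales as its square root:
\[
L \propto \sqrt{A/N}
\quad\Longrightarrow\quad
N \to 2N \;\Rightarrow\; L \to L/\sqrt{2} \approx 0.71\,L .
\]
```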

  2. Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Beylkin

    2012-03-23

    Significant advances were made on all objectives of the research program. We have developed fast multiresolution methods for performing electronic structure calculations with emphasis on constructing efficient representations of functions and operators. We extended our approach to problems of scattering in solids, i.e. constructing fast algorithms for computing above the Fermi energy level. Part of the work was done in collaboration with Robert Harrison and George Fann at ORNL. Specific results (in part supported by this grant) are listed here and are described in greater detail. (1) We have implemented a fast algorithm to apply the Green's function for the free space (oscillatory) Helmholtz kernel. The algorithm maintains its speed and accuracy when the kernel is applied to functions with singularities. (2) We have developed a fast algorithm for applying periodic and quasi-periodic, oscillatory Green's functions and those with boundary conditions on simple domains. Importantly, the algorithm maintains its speed and accuracy when applied to functions with singularities. (3) We have developed a fast algorithm for obtaining and applying multiresolution representations of periodic and quasi-periodic Green's functions and Green's functions with boundary conditions on simple domains. (4) We have implemented modifications to improve the speed of adaptive multiresolution algorithms for applying operators which are represented via a Gaussian expansion. (5) We have constructed new nearly optimal quadratures for the sphere that are invariant under the icosahedral rotation group. (6) We obtained new results on approximation of functions by exponential sums and/or rational functions, one of the key methods that allows us to construct separated representations for Green's functions. (7) We developed a new fast and accurate reduction algorithm for obtaining optimal approximation of functions by exponential sums and/or their rational representations.
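
    To illustrate the kind of separated representation behind items (6) and (7) (a sketch of my own using the standard Gaussian-integral identity, not the authors' reduction algorithm), the snippet below approximates the kernel 1/r by a discretized sum of Gaussians.

```python
# Sketch (mine, not the authors' algorithm): approximate the kernel 1/r
# by a sum of Gaussians, via the identity
#   1/r = (2/sqrt(pi)) * int_0^inf exp(-r^2 t^2) dt,
# discretized with t = exp(s) and the trapezoid rule in s.
import math

def inv_r_as_gaussian_sum(r, s_min=-12.0, s_max=6.0, n=200):
    h = (s_max - s_min) / (n - 1)
    total = 0.0
    for k in range(n):
        s = s_min + k * h
        w = 0.5 * h if k in (0, n - 1) else h   # trapezoid weights
        # each term is a Gaussian in r with exponent exp(2s), weight w*exp(s)
        total += w * math.exp(s) * math.exp(-(r * math.exp(s)) ** 2)
    return (2.0 / math.sqrt(math.pi)) * total

for r in (0.1, 1.0, 10.0):
    print(r, inv_r_as_gaussian_sum(r), 1.0 / r)   # close to 1/r
```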

  3. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  4. MIP models for connected facility location: A theoretical and computational study☆

    Science.gov (United States)

    Gollowitzer, Stefan; Ljubić, Ivana

    2011-01-01

    This article comprises the first theoretical and computational study on mixed integer programming (MIP) models for the connected facility location problem (ConFL). ConFL combines facility location and Steiner trees: given a set of customers, a set of potential facility locations and some inter-connection nodes, ConFL searches for the minimum-cost way of assigning each customer to exactly one open facility, and connecting the open facilities via a Steiner tree. The costs needed for building the Steiner tree, facility opening costs and the assignment costs need to be minimized. We model ConFL using seven compact and three mixed integer programming formulations of exponential size. We also show how to transform ConFL into the Steiner arborescence problem. A full hierarchy between the models is provided. For two exponential size models we develop a branch-and-cut algorithm. An extensive computational study is based on two benchmark sets of randomly generated instances with up to 1300 nodes and 115,000 edges. We empirically compare the presented models with respect to the quality of obtained bounds and the corresponding running time. We report optimal values for all but 16 instances for which the obtained gaps are below 0.6%. PMID:25009366
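
    As a toy illustration of the facility-opening-plus-assignment core of ConFL (my simplified sketch: plain uncapacitated facility location with no Steiner tree, solved by brute force on hypothetical data rather than by the paper's branch-and-cut):

```python
# Toy sketch of the opening-cost + assignment core of ConFL (uncapacitated
# facility location, no Steiner tree), brute-forced on hypothetical data;
# the paper's instances require MIP formulations and branch-and-cut.
from itertools import chain, combinations

facilities = ["f1", "f2"]
customers = ["c1", "c2"]
open_cost = {"f1": 4.0, "f2": 3.0}
assign_cost = {("c1", "f1"): 1.0, ("c1", "f2"): 3.0,
               ("c2", "f1"): 2.0, ("c2", "f2"): 1.0}

def total_cost(open_set):
    # opening costs plus each customer's cheapest open facility
    return (sum(open_cost[f] for f in open_set)
            + sum(min(assign_cost[(c, f)] for f in open_set)
                  for c in customers))

all_nonempty = chain.from_iterable(
    combinations(facilities, k) for k in range(1, len(facilities) + 1))
best = min(all_nonempty, key=total_cost)
print(best, total_cost(best))   # -> ('f1',) 7.0
```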

  5. MIP models for connected facility location: A theoretical and computational study.

    Science.gov (United States)

    Gollowitzer, Stefan; Ljubić, Ivana

    2011-02-01

    This article comprises the first theoretical and computational study on mixed integer programming (MIP) models for the connected facility location problem (ConFL). ConFL combines facility location and Steiner trees: given a set of customers, a set of potential facility locations and some inter-connection nodes, ConFL searches for the minimum-cost way of assigning each customer to exactly one open facility, and connecting the open facilities via a Steiner tree. The costs needed for building the Steiner tree, facility opening costs and the assignment costs need to be minimized. We model ConFL using seven compact and three mixed integer programming formulations of exponential size. We also show how to transform ConFL into the Steiner arborescence problem. A full hierarchy between the models is provided. For two exponential size models we develop a branch-and-cut algorithm. An extensive computational study is based on two benchmark sets of randomly generated instances with up to 1300 nodes and 115,000 edges. We empirically compare the presented models with respect to the quality of obtained bounds and the corresponding running time. We report optimal values for all but 16 instances for which the obtained gaps are below 0.6%.

  6. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Plessis, Anton du, E-mail: anton2@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Physics Department, Stellenbosch University, Stellenbosch (South Africa); Roux, Stephan Gerhard le, E-mail: lerouxsg@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Guelpa, Anina, E-mail: aninag@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa)

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory’s first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  7. National Ignition Facility sub-system design requirements integrated timing system SSDR 1.5.3

    Energy Technology Data Exchange (ETDEWEB)

    Wiedwald, J.; Van Aersau, P.; Bliss, E.

    1996-08-26

    This System Design Requirement document establishes the performance, design, development, and test requirements for the Integrated Timing System, WBS 1.5.3 which is part of the NIF Integrated Computer Control System (ICCS). The Integrated Timing System provides all temporally-critical hardware triggers to components and equipment in other NIF systems.

  8. ARN Integrated Retail Module (IRM) System at Ft. Bliss - Central Issue Facility (CIF) Local Tariff

    Science.gov (United States)

    2008-03-14

    ...automated support to a Central Issue Facility (CIF) for issue to deploying soldiers, a Central Initial Issue Point (CIIP) for issue to recruits, and a 3D Whole Body Scanner for obtaining body... Keywords: Apparel Research Network (ARN); Integrated Retail Module (IRM); Clothing Initial Issue Point (CIIP); Central Issue Facility (CIF); Supply Chain...

  9. Integrated Electrical and Thermal Grid Facility - Testing of Future Microgrid Technologies

    OpenAIRE

    2015-01-01

    This paper describes the Experimental Power Grid Centre (EPGC) microgrid test facility, which was developed to enable research, development and testing for a wide range of distributed generation and microgrid technologies. The EPGC microgrid facility comprises an integrated electrical and thermal grid with a flexible and configurable architecture, and includes various distributed energy resources and emulators, such as generators, renewables, energy storage technologies and programmable load ba...

  10. An Integrated Intuitionistic Fuzzy Multi Criteria Decision Making Method for Facility Location Selection

    OpenAIRE

    BORAN, Fatih

    2011-01-01

    The facility location selection, which is one of the important activities in strategic planning for a wide range of private and public companies, is a multi-criteria decision making problem that includes both quantitative and qualitative criteria. Traditional methods for facility location selection cannot handle this effectively because, under many conditions, the available information cannot be represented precisely. This paper proposes the integration of intuitionistic fuzzy preference relation a...

  11. Artificial and Computational Intelligence in Games: Integration (Dagstuhl Seminar 15051)

    OpenAIRE

    Lucas, Simon M.; Mateas, Michael; Preuss, Mike; Spronck, Pieter; Togelius, Julian

    2015-01-01

    This report documents Dagstuhl Seminar 15051 "Artificial and Computational Intelligence in Games: Integration". The focus of the seminar was on the computational techniques used to create, enhance, and improve the experiences of humans interacting with and within virtual environments. Different researchers in this field have different goals, including developing and testing new AI methods, creating interesting and believable non-player characters, improving the game production pipeline, study...

  12. Integrated Framework for Patient Safety and Energy Efficiency in Healthcare Facilities Retrofit Projects.

    Science.gov (United States)

    Mohammadpour, Atefeh; Anumba, Chimay J; Messner, John I

    2016-07-01

    There is a growing focus on enhancing energy efficiency in healthcare facilities, many of which are decades old. Since replacement of all aging healthcare facilities is not economically feasible, the retrofitting of these facilities is an appropriate path, which also provides an opportunity to incorporate energy efficiency measures. In undertaking energy efficiency retrofits, it is vital that the safety of the patients in these facilities is maintained or enhanced. However, the interactions between patient safety and energy efficiency have not been adequately addressed to realize the full benefits of retrofitting healthcare facilities. To address this, an innovative integrated framework, the Patient Safety and Energy Efficiency (PATSiE) framework, was developed to simultaneously enhance patient safety and energy efficiency. The framework includes a step-by-step procedure for enhancing both patient safety and energy efficiency. It provides a structured overview of the different stages involved in retrofitting healthcare facilities and improves understanding of the intricacies associated with integrating patient safety improvements with energy efficiency enhancements. Evaluation of the PATSiE framework was conducted through focus groups with the key stakeholders in two case study healthcare facilities. The feedback from these stakeholders was generally positive, as they considered the framework useful and applicable to retrofit projects in the healthcare industry.

  13. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    Energy Technology Data Exchange (ETDEWEB)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-09-05

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1 which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in ICCS (WBS 1.5) which is the document directly above.

  14. An Integrative Study on Bioinformatics Computing Concepts, Issues and Problems

    Directory of Open Access Journals (Sweden)

    Muhammad Zakarya

    2011-11-01

    Full Text Available Bioinformatics is the fusion of biological science and IT. The discipline covers every computational tool and technique used to manage, analyse and manipulate large sets of biological data. The discipline also helps in the creation of databases to store and manage biological data, the development of computer algorithms to find relationships in these databases, and the use of computer tools for the study and understanding of biological information, including DNA, RNA, protein sequences, gene expression profiles, protein structures, and biochemical pathways. This paper takes an integrative approach: a solution to a problem in one discipline may also be a solution to a problem in a different discipline. For example, entropy, borrowed from the physical sciences, is a solution to many problems and issues in computer science. Another example is bioinformatics itself, where computing methods and applications are applied to biological information. This paper is an initial step towards that goal and discusses the need for integration of multiple disciplines and sciences. Similarly, green chemistry gives birth to a new kind of computing, i.e. green computing. In later versions of this paper we will study biological fuel cells and discuss the development of a mobile battery that stays charged for its lifetime using the concepts of biological fuel cells. Another issue to be discussed in this series is brain tumor detection. To start with, this paper is a review of bioinformatics.

  15. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on the operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  16. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as

  17. Application of Computer Integration Technology for Fire Safety Analysis

    Institute of Scientific and Technical Information of China (English)

    SHI Jianyong; LI Yinqing; CHEN Huchuan

    2008-01-01

    With the development of information technology, fire safety assessment of a whole structure or region based on computer simulation has become a hot topic. Traditionally, however, the relevant studies are performed separately for different objectives, and it is difficult to perform an overall evaluation. This paper presents a new multi-dimensional integration model and methodology for fire safety assessment, and introduces two newly developed integrated systems to demonstrate the function of integrated simulation technology. The first is the analysis of the fire-resistant behaviour of a whole structure under real fire loads. The second is the study of fire evaluation and emergency rescue of a campus based on geographic information systems (GIS). Some practical examples are presented to illustrate the advantages of computer integration technology for fire safety assessment and to highlight some problems in the simulation. The results show that the multi-dimensional integration model offers a new approach and platform for integrated fire safety assessment of a whole structure or region, and that the integrated software developed is a useful engineering tool for cost-saving and safe design.

  18. Computer Technology Integration and Student Learning: Barriers and Promise

    Science.gov (United States)

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    Political and institutional support has enabled many institutions of learning to spend millions of dollars to acquire educational computing tools (Ficklen and Muscara, "Am Educ" 25(3):22-29, 2001) that have not been effectively integrated into the curriculum. While access to educational technology tools has remarkably improved in most schools,…

  19. Computer integrated manufacturing in the chemical industry : Theory & practice

    NARCIS (Netherlands)

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    This paper addresses the possibilities of implementing Computer Integrated Manufacturing in the process industry, and the chemical industry in particular. After presenting some distinct differences of the process industry in relation to discrete manufacturing, a number of focal points are discussed.

  20. A Partnership Project: Integrating Computer Technology and Orff-Schulwerk.

    Science.gov (United States)

    Woody, Robert H.; Fredrickson, Julie M.

    2000-01-01

    Describes an alternative approach for general music educators wanting to study new instructional strategies in which a classroom teacher and university educator collaborated to explore the integration of computer technology with Orff-Schulwerk in second- and third-grade music classes. Discusses the project and two of its technology-assisted…

  1. Medical robotics and computer-integrated interventional medicine

    Science.gov (United States)

    Taylor, Russell H.

    2012-02-01

    Computer-Integrated Interventional Medicine (CIIM) promises to have a profound impact on health care in the next 20 years, much as, and for many of the same reasons that, the marriage of computers and information processing methods with other technology has had on manufacturing, transportation, and other sectors of our society. Our basic premise is that the steps of creating patient-specific computational models, using these models for planning, registering the models and plans with the actual patient in the operating room, and using this information with appropriate technology to assist in carrying out and monitoring the intervention are best viewed as part of a complete patient-specific intervention process that occurs over many time scales. Further, the information generated in computer-integrated interventions can be captured and analyzed statistically to improve treatment processes. This paper will explore these themes briefly, using examples drawn from our work at the Engineering Research Center for Computer-Integrated Surgical Systems and Technology (CISST ERC).

  2. A Fruitful Collaboration between ESO and the Max Planck Computing and Data Facility

    Science.gov (United States)

    Fourniol, N.; Zampieri, S.; Panea, M.

    2016-06-01

    The ESO Science Archive Facility (SAF) contains all La Silla Paranal Observatory raw data as well as, more recently introduced, processed data created at ESO with state-of-the-art pipelines or returned by the astronomical community. The SAF has been established for over 20 years and its current holding exceeds 700 terabytes. An overview of the content of the SAF and the preservation of that content is provided. The latest development to ensure the preservation of the SAF data, the provision of an independent backup copy of the whole SAF at the Max Planck Computing and Data Facility in Garching, is described.

  3. Framework for Integrating Safety, Operations, Security, and Safeguards in the Design and Operation of Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Darby, John L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Horak, Karl Emanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaChance, Jeffrey L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tolk, Keith Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitehead, Donnie Wayne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2007-10-01

    The US is currently on the brink of a nuclear renaissance that will result in near-term construction of new nuclear power plants. In addition, the Department of Energy’s (DOE) ambitious new Global Nuclear Energy Partnership (GNEP) program includes facilities for reprocessing spent nuclear fuel and reactors for transmuting safeguards material. The use of nuclear power and material has inherent safety, security, and safeguards (SSS) concerns that can impact the operation of the facilities. Recent concern over terrorist attacks and nuclear proliferation led to an increased emphasis on security and safeguard issues as well as the more traditional safety emphasis. To meet both domestic and international requirements, nuclear facilities include specific SSS measures that are identified and evaluated through the use of detailed analysis techniques. In the past, these individual assessments have not been integrated, which led to inefficient and costly design and operational requirements. This report provides a framework for a new paradigm where safety, operations, security, and safeguards (SOSS) are integrated into the design and operation of a new facility to decrease cost and increase effectiveness. Although the focus of this framework is on new nuclear facilities, most of the concepts could be applied to any new, high-risk facility.

  4. Approximation method to compute domain related integrals in structural studies

    Science.gov (United States)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2015-11-01

    Various engineering calculations use integral calculus in theoretical models, i.e. analytical and numerical models. For usual problems, integrals have exact mathematical solutions. If the domain of integration is complicated, several methods may be used to calculate the integral. The first idea is to divide the domain into smaller sub-domains for which there are direct calculation relations, e.g. in strength of materials the bending moment may be computed in some discrete points using graphical integration of the shear force diagram, which usually has a simple shape. Another example is in mathematics, where the area of a subgraph may be approximated by a set of rectangles or trapezoids used to calculate the definite integral. The goal of this work is to present our studies on the calculation of integrals over transverse section domains, computer-aided solutions and a generalizing method. The aim of our research is to create general computer-based methods to perform the calculations in structural studies. Thus, we define a Boolean algebra which operates with ‘simple’ shape domains. This algebraic standpoint uses addition and subtraction, conditioned by the sign of every ‘simple’ shape (-1 for the shapes to be subtracted). By ‘simple’ or ‘basic’ shape we mean either shapes for which there are direct calculation relations, or domains whose frontiers are approximated by known functions, for which the corresponding calculation is carried out using an algorithm. The ‘basic’ shapes are linked to the calculation of the most significant stresses in the section, a refined aspect which needs special attention. Starting from this idea, the libraries of ‘basic’ shapes include rectangles, ellipses and domains whose frontiers are approximated by spline functions. The domain triangulation methods suggested that another ‘basic’ shape to be considered is the triangle. The subsequent phase was to deduce the exact relations for the
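
    A minimal numeric sketch of the signed 'simple shape' algebra described above (my illustration with hypothetical dimensions): area and second moment of a rectangular section with a rectangular hole, combined via the parallel-axis theorem.

```python
# Signed-shape sketch of the 'basic shape' Boolean algebra described above:
# a rectangular section with a rectangular hole; area and second moment
# about the section's horizontal centroidal axis (hypothetical dimensions).
shapes = [
    # (sign, width b, height h, centroid offset d from the reference axis)
    (+1, 0.20, 0.30, 0.0),    # outer rectangle
    (-1, 0.12, 0.18, 0.0),    # hole, subtracted with sign -1
]

area = sum(s * b * h for s, b, h, d in shapes)
# Parallel-axis theorem per shape: I = b*h^3/12 + A*d^2, summed with signs
inertia = sum(s * (b * h**3 / 12 + b * h * d**2) for s, b, h, d in shapes)
print(f"A = {area:.4f} m^2, I = {inertia:.6e} m^4")
```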

  5. Contribution of sublinear and supralinear dendritic integration to neuronal computations.

    Science.gov (United States)

    Tran-Van-Minh, Alexandra; Cazé, Romain D; Abrahamsson, Therése; Cathala, Laurence; Gutkin, Boris S; DiGregorio, David A

    2015-01-01

    Nonlinear dendritic integration is thought to increase the computational ability of neurons. Most studies focus on how supralinear summation of excitatory synaptic responses arising from clustered inputs within single dendrites results in the enhancement of neuronal firing, enabling simple computations such as feature detection. Recent reports have shown that sublinear summation is also a prominent dendritic operation, extending the range of subthreshold input-output (sI/O) transformations conferred by dendrites. Like supralinear operations, sublinear dendritic operations also increase the repertoire of neuronal computations, but feature extraction requires different synaptic connectivity strategies for each of these operations. In this article we will review the experimental and theoretical findings describing the biophysical determinants of the three primary classes of dendritic operations: linear, sublinear, and supralinear. We then review a Boolean algebra-based analysis of simplified neuron models, which provides insight into how dendritic operations influence neuronal computations. We highlight how neuronal computations are critically dependent on the interplay of dendritic properties (morphology and voltage-gated channel expression), spiking threshold and distribution of synaptic inputs carrying particular sensory features. Finally, we describe how global (scattered) and local (clustered) integration strategies permit the implementation of similar classes of computations, one example being the object feature binding problem.
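
    A toy numeric sketch of the clustered-versus-scattered point above (my illustration, not a biophysical model): each branch applies a power-law nonlinearity to its summed inputs, so p < 1 gives sublinear and p > 1 supralinear summation.

```python
# Toy dendritic summation: each branch raises its summed input to power p.
def branch_output(inputs, p):
    return sum(inputs) ** p

def neuron_output(branches, p):
    return sum(branch_output(b, p) for b in branches if b)

clustered = [[1.0, 1.0], []]      # both inputs on one branch
scattered = [[1.0], [1.0]]        # one input per branch

for p, label in [(0.5, "sublinear"), (2.0, "supralinear")]:
    print(label,
          "clustered:", neuron_output(clustered, p),
          "scattered:", neuron_output(scattered, p))
# Sublinear favours scattered inputs (2.0 > 1.41); supralinear favours
# clustered ones (4.0 > 2.0) -- the connectivity-strategy point above.
```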

  6. Contribution of sublinear and supralinear dendritic integration to neuronal computations

    Directory of Open Access Journals (Sweden)

    Alexandra eTran-Van-Minh

    2015-03-01

    Full Text Available Nonlinear dendritic integration is thought to increase the computational ability of neurons. Most studies focus on how supralinear summation of excitatory synaptic responses arising from clustered inputs within single dendrites results in the enhancement of neuronal firing, enabling simple computations such as feature detection. Recent reports have shown that sublinear summation is also a prominent dendritic operation, extending the range of subthreshold input-output transformations conferred by dendrites. Like supralinear operations, sublinear dendritic operations also increase the repertoire of neuronal computations, but feature extraction requires different synaptic connectivity strategies for each of these operations. In this article we will review the experimental and theoretical findings describing the biophysical determinants of the three primary classes of dendritic operations: linear, sublinear, and supralinear. We then review a Boolean algebra-based analysis of simplified neuron models, which provides insight into how dendritic operations influence neuronal computations. We highlight how neuronal computations are critically dependent on the interplay of dendritic properties (morphology and voltage-gated channel expression), spiking threshold and distribution of synaptic inputs carrying particular sensory features. Finally, we describe how global (scattered) and local (clustered) integration strategies permit the implementation of similar classes of computations, one example being the object feature binding problem.

  7. Validation of an integral conceptual model of frailty in older residents of assisted living facilities

    NARCIS (Netherlands)

    Gobbens, Robbert J J; Krans, Anita; van Assen, Marcel A L M

    2015-01-01

    Objective: The aim of this cross-sectional study was to examine the validity of an integral model of the associations between life-course determinants, disease(s), frailty, and adverse outcomes in older persons who are resident in assisted living facilities. Methods: Between June 2013 and May 2014

  8. Validation of an integral conceptual model of frailty in older residents of assisted living facilities

    NARCIS (Netherlands)

    Gobbens, R.J.J.; Krans, A.; van Assen, M.A.L.M.

    2015-01-01

    Objective The aim of this cross-sectional study was to examine the validity of an integral model of the associations between life-course determinants, disease(s), frailty, and adverse outcomes in older persons who are resident in assisted living facilities. Methods Between June 2013 and May 2014

  9. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    Science.gov (United States)

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…

  10. Computer graphics application in the engineering design integration system

    Science.gov (United States)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.

    1975-01-01

    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct coupled low cost storage tube terminals with limited interactive capabilities, and a minicomputer based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer aided design.

  11. Multi-objective reverse logistics model for integrated computer waste management.

    Science.gov (United States)

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
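
    To make the cost-risk tradeoff concrete, here is a brute-force toy sketch (my own, with hypothetical per-unit costs and risks; the paper itself uses integer linear programming and Monte Carlo simulation): enumerate all allocations of three waste sources to two facilities and keep the Pareto-optimal ones.

```python
# Toy cost/risk tradeoff (hypothetical numbers, not the paper's model):
# allocate each of three waste sources to one of two facilities and
# keep the Pareto front of (cost, risk) pairs.
from itertools import product

cost = {"A": 2.0, "B": 3.0}      # per-unit processing cost by facility
risk = {"A": 5.0, "B": 1.0}      # per-unit environmental risk by facility
waste = [10.0, 20.0, 15.0]       # quantities from three sources

options = []
for alloc in product("AB", repeat=len(waste)):
    c = sum(cost[f] * q for f, q in zip(alloc, waste))
    r = sum(risk[f] * q for f, q in zip(alloc, waste))
    options.append((alloc, c, r))

# Keep allocations not dominated by any other (lower cost AND lower risk).
pareto = [o for o in options
          if not any(c2 <= o[1] and r2 <= o[2] and (c2, r2) != (o[1], o[2])
                     for _, c2, r2 in options)]
for alloc, c, r in sorted(pareto, key=lambda o: o[1]):
    print(alloc, "cost:", c, "risk:", r)
```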

  12. Computational investigations of low-emission burner facilities for char gas burning in a power boiler

    Science.gov (United States)

    Roslyakov, P. V.; Morozov, I. V.; Zaychenko, M. N.; Sidorkin, V. T.

    2016-04-01

    Various structural variants of low-emission burner facilities intended for char gas burning in an operating TP-101 boiler of the Estonia power plant are considered. The planned increase in volumes of shale reprocessing and, correspondingly, a rise in char gas volumes make their co-combustion necessary. In this connection, there was a need to develop a burner facility with a given capacity that provides effective char gas burning while fulfilling reliability and environmental requirements. For this purpose, the burner structure was based on staged fuel burning with gas recirculation. As a result of the preliminary analysis of possible structural variants, three types of previously proven burner facilities were chosen: a vortex burner with the supply of recirculation gases into the secondary air, a vortex burner with the baffle supply of recirculation gases between flows of the primary and secondary air, and a burner facility with a vortex pilot burner. Optimum structural characteristics and operating parameters were determined using numerical experiments. These experiments, using the ANSYS CFX computational fluid dynamics software, simulated the mixing, ignition, and burning of char gas. The numerical experiments determined, for every type of burner facility, the structural and operating parameters that gave effective char gas burning and met the required environmental standard on nitrogen oxide emissions. The burner facility for char gas burning with a pilot diffusion burner in the central part was developed and built according to the computation results. Preliminary field verification tests on the TP-101 boiler showed that the actual content of nitrogen oxides in char gas burner flames did not exceed the declared concentration of 150 ppm (200 mg/m3).

  13. Integration and use of Microgravity Research Facility: Lessons learned by the crystals by vapor transport experiment and Space Experiments Facility programs

    Science.gov (United States)

    Heizer, Barbara L.

    1992-01-01

    The Crystals by Vapor Transport Experiment (CVTE) and Space Experiments Facility (SEF) are materials processing facilities designed and built for use on the Space Shuttle middeck. The CVTE was built as a commercial facility owned by the Boeing Company. The SEF was built under contract to the UAH Center for Commercial Development of Space (CCDS). Both facilities include up to three furnaces capable of reaching 850 C minimum, stand-alone electronics and software, and independent cooling control. In addition, the CVTE includes a dedicated stowage locker for cameras, a laptop computer, and other ancillary equipment. Both systems are designed to fly in a Middeck Accommodations Rack (MAR), though the SEF is currently being integrated into a Spacehab rack. The CVTE hardware includes two transparent furnaces capable of achieving temperatures in the 850 to 870 C range. The transparent feature allows scientists/astronauts to directly observe and affect crystal growth both on the ground and in space. Cameras mounted to the rack provide photodocumentation of the crystal growth. The basic design of the furnace allows for modification to accommodate techniques other than vapor crystal growth. Early in the CVTE program, the decision was made to assign a principal scientist to develop the experiment plan, affect the hardware/software design, run the ground and flight research effort, and interface with the scientific community. The principal scientist is responsible to the program manager and is a critical member of the engineering development team. As a result of this decision, the hardware/experiment requirements were established in such a way as to balance the engineering and science demands on the equipment. Program schedules for hardware development, experiment definition and material selection, flight operations development and crew training, both for ground support personnel and astronauts, were all planned and carried out with the understanding that the success of the program science

  14. Size effects on insect hovering aerodynamics: an integrated computational study

    Energy Technology Data Exchange (ETDEWEB)

    Liu, H [Graduate School of Engineering, Chiba University, Chiba, 263-8522 (Japan); Aono, H [Department of Aerospace Engineering, University of Michigan, Ann Arbor, MI48109 (United States)], E-mail: hliu@faculty.chiba-u.jp, E-mail: aonoh@umich.edu

    2009-03-01

    Hovering is a remarkable feat observed in flying insects of all sizes. The effect of size on flapping-wing aerodynamics in insect hovering is of interest to the micro-air-vehicle (MAV) community and of importance to comparative morphologists. In this study, we present an integrated computational study of such size effects on insect hovering aerodynamics, performed using a biology-inspired dynamic flight simulator that integrates the modelling of realistic wing-body morphology, the modelling of flapping-wing and body kinematics, and an in-house Navier-Stokes solver. Results of four typical insect hovering flights, including a hawkmoth, a honeybee, a fruit fly and a thrips, over a wide range of Reynolds numbers from O(10{sup 4}) to O(10{sup 1}) are presented, demonstrating the feasibility of the present integrated computational methods in quantitatively modelling and evaluating the unsteady aerodynamics of insect flapping flight. Our results, based on realistic modelling of insect hovering, therefore offer an integrated understanding of the near-field vortex dynamics, the far-field wake and downwash structures, and their correlation with force production in terms of size and Reynolds number as well as wing kinematics. The results not only give an integrated interpretation of the similarities and differences of the near- and far-field vortex structures in insect hovering but also demonstrate that our methods can be an effective tool in MAV design.
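
    For orientation, the Reynolds-number range quoted above can be reproduced with a back-of-the-envelope scaling Re ~ U_tip * c / nu, where the mean wingtip speed is roughly 2 * Phi * f * R. The morphological values below are rough textbook-order estimates, not the paper's data.

```python
# Order-of-magnitude Reynolds numbers for hovering insects,
# Re = U_tip * c / nu with U_tip ~ 2 * Phi * f * R.
nu = 1.5e-5  # kinematic viscosity of air, m^2/s

# (wing length R [m], mean chord c [m], wingbeat frequency f [Hz],
#  stroke amplitude Phi [rad]) -- assumed illustrative values
insects = {
    "hawkmoth":  (0.050, 0.018, 26, 2.0),
    "honeybee":  (0.010, 0.003, 230, 1.6),
    "fruit fly": (0.0025, 0.0008, 200, 2.4),
    "thrips":    (0.0006, 0.0002, 250, 2.0),
}
for name, (R, c, f, Phi) in insects.items():
    U_tip = 2 * Phi * f * R       # average wingtip speed over a stroke
    Re = U_tip * c / nu
    print(f"{name:9s} Re ~ {Re:8.0f}")
```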

  15. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    Energy Technology Data Exchange (ETDEWEB)

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis, and should be considered a supplement to the Title I Design Report dated March 1986, wherein energy-related issues pertaining to building envelope and orientation as well as electrical systems design are discussed.

  16. [The Computer Competency of Nurses in Long-Term Care Facilities and Related Factors].

    Science.gov (United States)

    Chang, Ya-Ping; Kuo, Huai-Ting; Li, I-Chuan

    2016-12-01

    It is important for nurses who work in long-term care facilities (LTCFs) to have an adequate level of computer competency due to the multidisciplinary and comprehensive nature of long-term care services. Thus, it is important to understand the current computer competency of nursing staff in LTCFs and the factors that relate to this competency. To explore the computer competency of LTCF nurses and to identify the demographic and computer-usage characteristics that relate significantly to computer competency in the LTCF environment, a cross-sectional research design and a self-report questionnaire were used to collect data from 185 nurses working at LTCFs in Taipei. The results found that the variables of the frequency of computer use (β = .33), age (β = -.30), type(s) of software used at work (β = .28), hours of on-the-job training (β = -.14), prior work experience at other LTCFs (β = -.14), and Internet use at home (β = .12) explain 58.0% of the variance in the computer competency of participants. The results of the present study suggest that the following measures may help increase the computer competency of LTCF nurses. (1) Nurses should be encouraged to use electronic nursing records rather than handwritten records. (2) On-the-job training programs should emphasize participant competency in the Excel software package in order to maintain efficient, good-quality LTC services after implementation of the LTC insurance policy.
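
    For readers unfamiliar with the statistics reported, the following sketch shows how standardized (β) coefficients and the variance explained (R²) are typically computed from a multiple linear regression; the data are synthetic and merely mimic the sign pattern of a few of the reported predictors.

```python
# Standardized regression coefficients and R^2 on synthetic data
# (illustrative only; not the study's dataset).
import numpy as np

rng = np.random.default_rng(0)
n = 185                        # sample size matching the study
X = rng.normal(size=(n, 3))    # e.g. frequency of use, age, training hours
y = 0.33 * X[:, 0] - 0.30 * X[:, 1] - 0.14 * X[:, 2] \
    + rng.normal(scale=0.8, size=n)

# z-score predictors and outcome; betas are then the OLS coefficients
Xz = (X - X.mean(0)) / X.std(0)
yz = (y - y.mean()) / y.std()
beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
r2 = 1 - np.sum((yz - Xz @ beta) ** 2) / np.sum(yz ** 2)
print("betas:", beta.round(2), "R^2:", round(r2, 2))
```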

  17. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide `as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computational system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations fully in parallel) to meet the demands of both the control systems within a shot cycle and the NIF user community outside of a shot cycle.

  18. Access to the energy system network simulator (ESNS), via remote computer terminals. [BNL CDC 7600/6600 computer facility

    Energy Technology Data Exchange (ETDEWEB)

    Reisman, A W

    1976-08-15

    The Energy System Network Simulator (ESNS) flow model is installed on the Brookhaven National Laboratory (BNL) CDC 7600/6600 computer facility for access by off-site users. The method of access available to outside users is through a system called CDC-INTERCOM, which allows communication between the BNL machines and remote teletype terminals. This write-up gives a brief description of INTERCOM for users unfamiliar with this system and a step-by-step guide to using INTERCOM in order to access ESNS.

  19. Computer integration of engineering design and production: A national opportunity

    Science.gov (United States)

    1984-01-01

    The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.

  20. Applying Integrated Computer Assisted Media (ICAM) in Teaching Vocabulary

    Directory of Open Access Journals (Sweden)

    Opick Dwi Indah

    2015-02-01

    The objective of this research was to find out whether the use of integrated computer assisted media (ICAM) is effective in improving the vocabulary achievement of second-semester students of Cokroaminoto Palopo University. The population of this research was the second-semester students of the English department of Cokroaminoto Palopo University in the academic year 2013/2014. The samples were 60 students placed into two groups, experimental and control, with 30 students in each; the research used a cluster random sampling technique. The data were collected with a vocabulary test and analyzed using descriptive and inferential statistics. The result of this research was that integrated computer assisted media (ICAM) can improve the vocabulary achievement of the students of the English department of Cokroaminoto Palopo University. It can be concluded that the use of ICAM in teaching vocabulary is effective in improving the students' vocabulary achievement.

  1. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and refine computational models. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
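
    As a minimal sketch of the response surface methodology mentioned, the code below fits a full second-order polynomial to a face-centered central composite design in two coded factors; the response values are synthetic stand-ins for experiment or simulation output.

```python
# Second-order response surface fit on a face-centered central composite
# design (two coded factors); the response function is synthetic.
import numpy as np

# factorial, axial, and center points in coded units
pts = [(-1, -1), (1, -1), (-1, 1), (1, 1),
       (-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]
X1, X2 = np.array(pts, dtype=float).T
y = (5 + 2 * X1 - 1.5 * X2 + 0.8 * X1 * X2 - 1.2 * X1**2
     + np.random.default_rng(1).normal(0, 0.1, len(pts)))

# design matrix for the full quadratic model
A = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1**2, X2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in zip(["1", "x1", "x2", "x1*x2", "x1^2", "x2^2"], coef):
    print(f"{name:6s} {c:+.2f}")
```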

  2. Multilayer microwave integrated quantum circuits for scalable quantum computing

    Science.gov (United States)

    Brecht, Teresa; Pfaff, Wolfgang; Wang, Chen; Chu, Yiwen; Frunzio, Luigi; Devoret, Michel H.; Schoelkopf, Robert J.

    2016-02-01

    As experimental quantum information processing (QIP) rapidly advances, an emerging challenge is to design a scalable architecture that combines various quantum elements into a complex device without compromising their performance. In particular, superconducting quantum circuits have successfully demonstrated many of the requirements for quantum computing, including coherence levels that approach the thresholds for scaling. However, it remains challenging to couple a large number of circuit components through controllable channels while suppressing any other interactions. We propose a hardware platform intended to address these challenges, which combines the advantages of integrated circuit fabrication and the long coherence times achievable in three-dimensional circuit quantum electrodynamics. This multilayer microwave integrated quantum circuit platform provides a path towards the realisation of increasingly complex superconducting devices in pursuit of a scalable quantum computer.

  3. Integrating computer technology in the teaching of Biology

    OpenAIRE

    GarrawayLashley, Yassanne

    2014-01-01

    Over the past decade, few students in Guyana have gained satisfactory passes in Biology at the Caribbean Secondary Education Certificate (CSEC) level. This poor performance may be attributed to the traditional method used to teach Biology. This study therefore ascertained whether the integration of computer technology into the teaching of Biology would enhance students' academic performance. The study was guided by a null research hypothesis. Hence, the related...

  4. CONSTRUCTION COST INTEGRATED CONTROL BASED ON COMPUTER SIMULATION

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Construction cost control is a complex systems-engineering problem. The traditional control method cannot dynamically control construction cost in advance because of its hysteresis. This paper proposes a computer-simulation-based construction cost integrated control method, which systematically combines cost with PERT so that construction cost can be predicted and optimized systematically and effectively. The new method overcomes the hysteresis of the traditional systems and is a distinct improvement over them in effect and practicality.
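
    The following sketch illustrates one plausible reading of the proposed combination of simulation, cost, and PERT (the paper's exact formulation is not given): sample activity durations, propagate them through a small activity network, and derive schedule and cost percentiles that can be monitored in advance. Activities and figures are invented.

```python
# Monte Carlo PERT sketch: sample activity durations, compute project
# duration along a tiny network, attach a duration-driven cost.
import random

# activity: (optimistic, most likely, pessimistic) duration in days
acts = {"A": (4, 6, 10), "B": (3, 5, 9), "C": (5, 7, 12)}
daily_overhead = 1000.0  # cost per project day (assumed)

random.seed(42)
durations, costs = [], []
for _ in range(10000):
    d = {k: random.triangular(lo, hi, ml) for k, (lo, ml, hi) in acts.items()}
    # toy network: A then B, with C running in parallel to (A -> B)
    T = max(d["A"] + d["B"], d["C"])
    durations.append(T)
    costs.append(T * daily_overhead)

durations.sort()
print("mean duration:", round(sum(durations) / len(durations), 1))
print("P90 duration :", round(durations[int(0.9 * len(durations))], 1))
print("mean cost    :", round(sum(costs) / len(costs), 0))
```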

  5. Integrated flight propulsion control research results using the NASA F-15 HIDEC Flight Research Facility

    Science.gov (United States)

    Stewart, James F.

    1992-01-01

    Over the last two decades, NASA has conducted several flight research experiments in integrated flight propulsion control. Benefits have included increased thrust, range, and survivability; reduced fuel consumption; and reduced maintenance. These flight programs were flown at the NASA Dryden Flight Research Facility. This paper presents the basic concepts for control integration, examples of implementation, and benefits of integrated flight propulsion control systems. The F-15 research involved integration of the engine, flight, and inlet control systems. Further extensions of the integration included real-time, onboard optimization of engine, inlet, and flight control variables; a self-repairing flight control system; and an engines-only control concept for emergency control. The flight research programs and the resulting benefits are described for the F-15 research.

  6. Integration of the Chinese HPC Grid in ATLAS Distributed Computing

    CERN Document Server

    Filip\\v{c}i\\v{c}, Andrej; The ATLAS collaboration

    2016-01-01

    Fifteen Chinese High Performance Computing sites, many of them on the TOP500 list of the most powerful supercomputers, are integrated into a common infrastructure providing coherent access to users through a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI which translates the authorization and job submission protocols between the two environments. The ARC Computing Element (ARC CE) forms the bridge, using an extended batch system interface to allow job submission to SCEAPI. The ARC CE was set up at the Institute for High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between the ARC CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte C...

  8. Structural integrity assessment based on the HFR Petten neutron beam facilities

    CERN Document Server

    Ohms, C; Idsert, P V D

    2002-01-01

    Neutrons are becoming recognized as a valuable tool for the structural-integrity assessment of industrial components and for advanced materials development. Microstructure, texture and residual stress analyses are commonly performed by neutron diffraction, and a joint CEN/ISO pre-standard for residual stress analysis is under development. Furthermore, neutrons allow for defect analyses, i.e. of precipitates, voids, pores and cracks, through small-angle neutron scattering (SANS) or radiography. At the High Flux Reactor, 12 beam tubes have been installed for the extraction of thermal neutrons for such applications. Two of them are equipped with neutron diffractometers for residual stress and structure determination and have been used extensively in the past. Several other facilities are currently being reactivated and upgraded. These include the SANS and radiography facilities as well as a powder diffractometer. This paper summarizes the main characteristics and current status of these facilities as well as recently in...

  9. An integrated prediction and optimization model of biogas production system at a wastewater treatment facility.

    Science.gov (United States)

    Akbaş, Halil; Bilgen, Bilge; Turhan, Aykut Melih

    2015-11-01

    This study proposes an integrated prediction and optimization model using multi-layer perceptron neural network and particle swarm optimization techniques. Three different objective functions are formulated. The first is the maximization of methane percentage with a single output. The second is the maximization of biogas production with a single output. The last is the maximization of biogas quality and biogas production with two outputs. Methane percentage, carbon dioxide percentage, and the percentage of other contents are used as the biogas quality criteria. Based on the formulated models and data from a wastewater treatment facility, optimal values of the input variables and their corresponding maximum output values are found for each model. It is expected that the application of the integrated prediction and optimization models will increase biogas production and biogas quality, and contribute to the quantity of electricity production at the wastewater treatment facility.
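
    A minimal sketch of the prediction-plus-optimization pattern described, assuming synthetic data: a multi-layer perceptron is trained as a surrogate of the plant, and a bare-bones particle swarm then searches its inputs for the maximum predicted output. The variable names and ranges are placeholders, and scikit-learn's MLPRegressor stands in for the study's network.

```python
# MLP surrogate + simple particle swarm maximization (illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(300, 3))  # e.g. flow, temperature, organic load
y = 5 * X[:, 0] - 3 * (X[:, 1] - 0.6) ** 2 + X[:, 2] \
    + rng.normal(0, 0.05, 300)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                     random_state=0).fit(X, y)

# bare-bones particle swarm maximizing the surrogate prediction
n, dim = 30, 3
pos = rng.uniform(0, 1, (n, dim)); vel = np.zeros((n, dim))
pbest = pos.copy(); pbest_val = model.predict(pos)
gbest = pbest[pbest_val.argmax()].copy()
for _ in range(100):
    r1, r2 = rng.uniform(size=(2, n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)          # keep particles in bounds
    val = model.predict(pos)
    improved = val > pbest_val
    pbest[improved] = pos[improved]; pbest_val[improved] = val[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best inputs:", gbest.round(2),
      "predicted output:", float(model.predict([gbest])[0]))
```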

  10. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard; Moeglein, William AM; Newby, Deborah T.; Venteris, Erik R.; Wigmosta, Mark S.

    2014-07-01

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain, from facility siting and design through processing and upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF), an integrated multi-scale modeling, analysis, and data management suite, to key issues in developing and operating an open-pond facility: analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific "optimum" facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons per year in the southeastern U.S.; the results indicate that costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  11. SynapSense Wireless Environmental Monitoring System of the RHIC & ATLAS Computing Facility at BNL

    Science.gov (United States)

    Casella, K.; Garcia, E.; Hogue, R.; Hollowell, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    RHIC & ATLAS Computing Facility (RACF) at BNL is a 15000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics research oriented IT installations. The facility originated in 1990 and grew steadily up to the present configuration with 4 physically isolated IT areas with the maximum rack capacity of about 1000 racks and the total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal to replace several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution by SynapSense Corporation based on wireless sensor groups and proprietary SynapSense™ MapSense™ software that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements and overall power usage efficiency estimations that are not currently implemented within RACF but may be deployed in the future.

  12. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments, enormous amounts of data are analyzed and simulated. Traditionally, dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers that provide regular cloud services to users, as these can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost-efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution reports on the concept of our cloud manager and an implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster, located in Freiburg).
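
    To make the provisioning idea concrete, here is a toy reconciliation loop in the spirit of an on-demand cloud manager (this is not ROCED's code or API): queued jobs are compared against running virtual workers, and VMs are booted or drained accordingly. CloudSite is a hypothetical stand-in for an OpenStack or HPC backend.

```python
# Toy demand-driven VM provisioning loop (hypothetical, not ROCED).
class CloudSite:
    def __init__(self, max_vms):
        self.max_vms, self.running = max_vms, 0
    def boot(self, n):  self.running += n   # a real manager calls the cloud API
    def drain(self, n): self.running -= n

def reconcile(queued_jobs, jobs_per_vm, site):
    """Scale the VM pool toward current demand, within the site limit."""
    wanted = min((queued_jobs + jobs_per_vm - 1) // jobs_per_vm, site.max_vms)
    delta = wanted - site.running
    if delta > 0:
        site.boot(delta)
    elif delta < 0:
        site.drain(-delta)
    return delta

site = CloudSite(max_vms=50)
for demand in [10, 400, 120, 0]:   # queued jobs sampled each cycle
    d = reconcile(demand, jobs_per_vm=8, site=site)
    print(f"demand={demand:3d} -> adjust {d:+3d}, running {site.running}")
```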

  13. Checkout and start-up of the integrated DWPF (Defense Waste Processing Facility) melter system

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.E.; Hutson, N.D.; Miller, D.H.; Morrison, J.; Shah, H.; Shuford, J.A.; Glascock, J.; Wurzinger, F.H.; Zamecnik, J.R.

    1989-11-11

    The Integrated DWPF Melter System (IDMS) is a one-ninth-scale demonstration of the Defense Waste Processing Facility (DWPF) feed preparation, melter, and off-gas systems. The IDMS will be the first engineering-scale melter system at SRL to process mercury and flowsheet levels of halides and sulfates. This report includes a summary of the IDMS program objectives, system and equipment descriptions, and detailed discussions of the system checkout and start-up. 10 refs., 44 figs., 20 tabs.

  14. Track Reconstruction with Cosmic Ray Data at the Tracker Integration Facility

    CERN Document Server

    Adam, Wolfgang; Dragicevic, Marko; Friedl, Markus; Fruhwirth, R; Hansel, S; Hrubec, Josef; Krammer, Manfred; Oberegger, Margit; Pernicka, Manfred; Schmid, Siegfried; Stark, Roland; Steininger, Helmut; Uhl, Dieter; Waltenberger, Wolfgang; Widl, Edmund; Van Mechelen, Pierre; Cardaci, Marco; Beaumont, Willem; de Langhe, Eric; de Wolf, Eddi A; Delmeire, Evelyne; Hashemi, Majid; Bouhali, Othmane; Charaf, Otman; Clerbaux, Barbara; Elgammal, J.-P. Dewulf. S; Hammad, Gregory Habib; de Lentdecker, Gilles; Marage, Pierre Edouard; Vander Velde, Catherine; Vanlaer, Pascal; Wickens, John; Adler, Volker; Devroede, Olivier; De Weirdt, Stijn; D'Hondt, Jorgen; Goorens, Robert; Heyninck, Jan; Maes, Joris; Mozer, Matthias Ulrich; Tavernier, Stefaan; Van Lancker, Luc; Van Mulders, Petra; Villella, Ilaria; Wastiels, C; Bonnet, Jean-Luc; Bruno, Giacomo; De Callatay, Bernard; Florins, Benoit; Giammanco, Andrea; Gregoire, Ghislain; Keutgen, Thomas; Kcira, Dorian; Lemaitre, Vincent; Michotte, Daniel; Militaru, Otilia; Piotrzkowski, Krzysztof; Quertermont, L; Roberfroid, Vincent; Rouby, Xavier; Teyssier, Daniel; Daubie, Evelyne; Anttila, Erkki; Czellar, Sandor; Engstrom, Pauli; Harkonen, J; Karimaki, V; Kostesmaa, J; Kuronen, Auli; Lampen, Tapio; Linden, Tomas; Luukka, Panja-Riina; Maenpaa, T; Michal, Sebastien; Tuominen, Eija; Tuominiemi, Jorma; Ageron, Michel; Baulieu, Guillaume; Bonnevaux, Alain; Boudoul, Gaelle; Chabanat, Eric; Chabert, Eric Christian; Chierici, Roberto; Contardo, Didier; Della Negra, Rodolphe; Dupasquier, Thierry; Gelin, Georges; Giraud, Noël; Guillot, Gérard; Estre, Nicolas; Haroutunian, Roger; Lumb, Nicholas; Perries, Stephane; Schirra, Florent; Trocme, Benjamin; Vanzetto, Sylvain; Agram, Jean-Laurent; Blaes, Reiner; Drouhin, Frédéric; Ernenwein, Jean-Pierre; Fontaine, Jean-Charles; Berst, Jean-Daniel; Brom, Jean-Marie; Didierjean, Francois; Goerlach, Ulrich; Graehling, Philippe; Gross, Laurent; Hosselet, J; Juillot, Pierre; Lounis, Abdenour; Maazouzi, Chaker; Olivetto, Christian; Strub, Roger; Van Hove, Pierre; Anagnostou, Georgios; Brauer, Richard; Esser, Hans; Feld, Lutz; Karpinski, Waclaw; Klein, Katja; Kukulies, Christoph; Olzem, Jan; Ostapchuk, Andrey; Pandoulas, Demetrios; Pierschel, Gerhard; Raupach, Frank; Schael, Stefan; Schwering, Georg; Sprenger, Daniel; Thomas, Maarten; Weber, Markus; Wittmer, Bruno; Wlochal, Michael; Beissel, Franz; Bock, E; Flugge, G; Gillissen, C; Hermanns, Thomas; Heydhausen, Dirk; Jahn, Dieter; Kaussen, Gordon; Linn, Alexander; Perchalla, Lars; Poettgens, Michael; Pooth, Oliver; Stahl, Achim; Zoeller, Marc Henning; Buhmann, Peter; Butz, Erik; Flucke, Gero; Hamdorf, Richard Helmut; Hauk, Johannes; Klanner, Robert; Pein, Uwe; Schleper, Peter; Steinbruck, G; Blum, P; De Boer, Wim; Dierlamm, Alexander; Dirkes, Guido; Fahrer, Manuel; Frey, Martin; Furgeri, Alexander; Hartmann, Frank; Heier, Stefan; Hoffmann, Karl-Heinz; Kaminski, Jochen; Ledermann, Bernhard; Liamsuwan, Thiansin; Muller, S; Muller, Th; Schilling, Frank-Peter; Simonis, Hans-Jürgen; Steck, Pia; Zhukov, Valery; Cariola, P; De Robertis, Giuseppe; Ferorelli, Raffaele; Fiore, Luigi; Preda, M; Sala, Giuliano; Silvestris, Lucia; Tempesta, Paolo; Zito, Giuseppe; Creanza, Donato; De Filippis, Nicola; De Palma, Mauro; Giordano, Domenico; Maggi, Giorgio; Manna, Norman; My, Salvatore; Selvaggi, Giovanna; Albergo, Sebastiano; Chiorboli, Massimiliano; Costa, Salvatore; Galanti, Mario; Giudice, Nunzio; Guardone, Nunzio; Noto, Francesco; Potenza, Renato; Saizu, Mirela Angela; Sparti, V; Sutera, Concetta; 
Tricomi, Alessia; Tuve, Cristina; Brianzi, Mirko; Civinini, Carlo; Maletta, Fernando; Manolescu, Florentina; Meschini, Marco; Paoletti, Simone; Sguazzoni, Giacomo; Broccolo, B; Ciulli, Vitaliano; Focardi, R. D'Alessandro. E; Frosali, Simone; Genta, Chiara; Landi, Gregorio; Lenzi, Piergiulio; Macchiolo, Anna; Magini, Nicolo; Parrini, Giuliano; Scarlini, Enrico; Cerati, Giuseppe Benedetto; Azzi, Patrizia; Bacchetta, Nicola; Candelori, Andrea; Dorigo, Tommaso; Kaminsky, A; Karaevski, S; Khomenkov, Volodymyr; Reznikov, Sergey; Tessaro, Mario; Bisello, Dario; De Mattia, Marco; Giubilato, Piero; Loreti, Maurizio; Mattiazzo, Serena; Nigro, Massimo; Paccagnella, Alessandro; Pantano, Devis; Pozzobon, Nicola; Tosi, Mia; Bilei, Gian Mario; Checcucci, Bruno; Fano, Livio; Servoli, Leonello; Ambroglini, Filippo; Babucci, Ezio; Benedetti, Daniele; Biasini, Maurizio; Caponeri, Benedetta; Covarelli, Roberto; Giorgi, Marco; Lariccia, Paolo; Mantovani, Giancarlo; Marcantonini, Marta; Postolache, Vasile; Santocchia, Attilio; Spiga, Daniele; Bagliesi, Giuseppe; Balestri, Gabriele; Berretta, Luca; Bianucci, S; Boccali, Tommaso; Bosi, Filippo; Bracci, Fabrizio; Castaldi, Rino; Ceccanti, Marco; Cecchi, Roberto; Cerri, Claudio; Cucoanes, Andi Sebastian; Dell'Orso, Roberto; Dobur, Didar; Dutta, Suchandra; Giassi, Alessandro; Giusti, Simone; Kartashov, Dmitry; Kraan, Aafke; Lomtadze, Teimuraz; Lungu, George-Adrian; Magazzu, Guido; Mammini, Paolo; Mariani, Filippo; Martinelli, Giovanni; Moggi, Andrea; Palla, Fabrizio; Palmonari, Francesco; Petragnani, Giulio; Profeti, Alessandro; Raffaelli, Fabrizio; Rizzi, Domenico; Sanguinetti, Giulio; Sarkar, Subir; Sentenac, Daniel; Serban, Alin Titus; Slav, Adrian; Soldani, A; Spagnolo, Paolo; Tenchini, Roberto; Tolaini, Sergio; Venturi, Andrea; Verdini, Piero Giorgio; Vos, Marcel; Zaccarelli, Luciano; Avanzini, Carlo; Basti, Andrea; Benucci, Leonardo; Bocci, Andrea; Cazzola, Ugo; Fiori, Francesco; Linari, Stefano; Massa, Maurizio; Messineo, Alberto; Segneri, Gabriele; Tonelli, Guido; Azzurri, Paolo; Bernardini, Jacopo; Borrello, Laura; Calzolari, Federico; Foa, Lorenzo; Gennai, Simone; Ligabue, Franco; Petrucciani, Giovanni; Rizzi, Andrea; Yang, Zong-Chang; Benotto, Franco; Demaria, Natale; Dumitrache, Floarea; Farano, R; Borgia, Maria Assunta; Castello, Roberto; Costa, Marco; Migliore, Ernesto; Romero, Alessandra; Abbaneo, Duccio; Abbas, M; Ahmed, Ijaz; Akhtar, I; Albert, Eric; Bloch, Christoph; Breuker, Horst; Butt, Shahid Aleem; Buchmuller, Oliver; Cattai, Ariella; Delaere, Christophe; Delattre, Michel; Edera, Laura Maria; Engstrom, Pauli; Eppard, Michael; Gateau, Maryline; Gill, Karl; Giolo-Nicollerat, Anne-Sylvie; Grabit, Robert; Honma, Alan; Huhtinen, Mika; Kloukinas, Kostas; Kortesmaa, Jarmo; Kottelat, Luc-Joseph; Kuronen, Auli; Leonardo, Nuno; Ljuslin, Christer; Mannelli, Marcello; Masetti, Lorenzo; Marchioro, Alessandro; Mersi, Stefano; Michal, Sebastien; Mirabito, Laurent; Muffat-Joly, Jeannine; Onnela, Antti; Paillard, Christian; Pal, Imre; Pernot, Jean-Francois; Petagna, Paolo; Petit, Patrick; Piccut, C; Pioppi, Michele; Postema, Hans; Ranieri, Riccardo; Ricci, Daniel; Rolandi, Gigi; Ronga, Frederic Jean; Sigaud, Christophe; Syed, A; Siegrist, Patrice; Tropea, Paola; Troska, Jan; Tsirou, Andromachi; Vander Donckt, Muriel; Vasey, François; Alagoz, Enver; Amsler, Claude; Chiochia, Vincenzo; Regenfus, Christian; Robmann, Peter; Rochet, Jacky; Rommerskirchen, Tanja; Schmidt, Alexander; Steiner, Stefan; Wilke, Lotte; Church, Ivan; Cole, Joanne; Coughlan, John A; Gay, 
Arnaud; Taghavi, S; Tomalin, Ian R; Bainbridge, Robert; Cripps, Nicholas; Fulcher, Jonathan; Hall, Geoffrey; Noy, Matthew; Pesaresi, Mark; Radicci, Valeria; Raymond, David Mark; Sharp, Peter; Stoye, Markus; Wingham, Matthew; Zorba, Osman; Goitom, Israel; Hobson, Peter R; Reid, Ivan; Teodorescu, Liliana; Hanson, Gail; Jeng, Geng-Yuan; Liu, Haidong; Pasztor, Gabriella; Satpathy, Asish; Stringer, Robert; Mangano, Boris; Affolder, K; Affolder, T; Allen, Andrea; Barge, Derek; Burke, Samuel; Callahan, D; Campagnari, Claudio; Crook, A; D'Alfonso, Mariarosaria; Dietch, J; Garberson, Jeffrey; Hale, David; Incandela, H; Incandela, Joe; Jaditz, Stephen; Kalavase, Puneeth; Kreyer, Steven Lawrence; Kyre, Susanne; Lamb, James; Mc Guinness, C; Mills, C; Nguyen, Harold; Nikolic, Milan; Lowette, Steven; Rebassoo, Finn; Ribnik, Jacob; Richman, Jeffrey; Rubinstein, Noah; Sanhueza, S; Shah, Yousaf Syed; Simms, L; Staszak, D; Stoner, J; Stuart, David; Swain, Sanjay Kumar; Vlimant, Jean-Roch; White, Dean; Ulmer, Keith; Wagner, Stephen Robert; Bagby, Linda; Bhat, Pushpalatha C; Burkett, Kevin; Cihangir, Selcuk; Gutsche, Oliver; Jensen, Hans; Johnson, Mark; Luzhetskiy, Nikolay; Mason, David; Miao, Ting; Moccia, Stefano; Noeding, Carsten; Ronzhin, Anatoly; Skup, Ewa; Spalding, William J; Spiegel, Leonard; Tkaczyk, Slawek; Yumiceva, Francisco; Zatserklyaniy, Andriy; Zerev, E; Anghel, Ioana Maria; Bazterra, Victor Eduardo; Gerber, Cecilia Elena; Khalatian, S; Shabalina, Elizaveta; Baringer, Philip; Bean, Alice; Chen, Jie; Hinchey, Carl Louis; Martin, Christophe; Moulik, Tania; Robinson, Richard; Gritsan, Andrei; Lae, Chung Khim; Tran, Nhan Viet; Everaerts, Pieter; Hahn, Kristan Allan; Harris, Philip; Nahn, Steve; Rudolph, Matthew; Sung, Kevin; Betchart, Burton; Demina, Regina; Gotra, Yury; Korjenevski, Sergey; Miner, Daniel Carl; Orbaker, Douglas; Christofek, Leonard; Hooper, Ryan; Landsberg, Greg; Nguyen, Duong; Narain, Meenakshi; Speer, Thomas; Tsang, Ka Vang

    2008-01-01

    The subsystems of the CMS silicon strip tracker were integrated and commissioned at the Tracker Integration Facility (TIF) in the period from November 2006 to July 2007. As part of the commissioning, large samples of cosmic ray data were recorded under various running conditions in the absence of a magnetic field. Cosmic rays detected by scintillation counters were used to trigger the readout of up to 15% of the final silicon strip detector, and over 4.7 million events were recorded. This document describes the cosmic track reconstruction and presents results on the performance of track and hit reconstruction obtained from dedicated analyses.

  15. The integration of microgravity science experiments into shared or previously existing experiment facilities

    Science.gov (United States)

    Baer-Peckham, M. S.; Mccarley, K. S.

    1991-01-01

    The overall flow for integrating a sample into an experiment facility, specifically a materials science facility, is discussed using the Crystal Growth Furnace as an example. A typical preflight timeline for an experiment is discussed, including identification of all documentation and hardware deliveries. Each of the items presented is discussed in detail, including the experiment requirements document, the announcement-of-opportunity response, the experiment-specific equipment, safety reviews, the mission plan, and the hardware integration plan. These items are addressed both individually and with respect to their relevance to the program as a whole.

  17. An integrated computer aided system for integrated design of chemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Hytoft, Glen; Jaksland, Cecilia

    1997-01-01

    In this paper, an Integrated Computer Aided System (ICAS), which is particularly suitable for solving problems related to the integrated design of chemical processes, is presented. ICAS features include a model generator (generation of problem-specific models, including model simplification and model reduction), a simulator (use of problem-specific simulation strategies for steady state and dynamic simulation), toolboxes (thermodynamic toolbox, synthesis toolbox, control toolbox, design toolbox and analysis toolbox), and an interface for problem definition. Each toolbox solves a specific set of problems and communicates with all other computational tools available in ICAS. A large range of thermodynamic models for estimation of the necessary thermo-physical properties, a large range of computational algorithms for determination of various types of phase diagrams, algorithms for process synthesis, design, control...

  18. Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shoman, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.

  19. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  20. Integrated Nanophotonic Silicon Devices for Next Generation Computing Chips

    Science.gov (United States)

    Djordjevic, Stevan

    Development of the computing platform of the future depends largely on high-bandwidth interconnects at the intra-die level. Silicon photonics, as an innately CMOS-compatible technology, is a promising candidate for delivering terabit-per-second bandwidths through the use of wavelength division multiplex (WDM) signaling. Silicon photonic interconnects offer unmatched bandwidth, density, energy efficiency, latency and reach compared with electrical interconnects. WDM silicon photonic links are viewed today as a promising solution for resolving the inter/intra-chip communication bottlenecks of high performance computing systems. Towards maturity, silicon photonic technology has to resolve the issues of waveguide propagation loss, density of device integration, thermal stability of resonant devices, heterogeneous integration of various materials and many other problems. This dissertation describes the development of integrated photonic technology on silicon and silicon nitride platforms in increasing order of device complexity, from the fabrication process of low-loss waveguides and efficient off-chip coupling devices to die-size reconfigurable lattice filters for optical signal processing. Particular emphasis of the dissertation is on the demonstration of CMOS-compatible, athermal silicon ring modulators that potentially hold the key to solving the thermal problem of silicon photonic devices. The development of high-quality amorphous titanium dioxide films with a negative thermo-optic coefficient enabled the fabrication of gigahertz-bandwidth silicon ring modulators that can be made insensitive to ambient temperature changes.

  1. Improvement of the Computing - Related Procurement Process at a Government Research Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gittins, C.

    2000-04-03

    The purpose of the project was to develop, implement, and market value-added services through the Computing Resource Center in an effort to streamline computing-related procurement processes across the Lawrence Livermore National Laboratory (LLNL). The power of the project lay in focusing attention on the value of centralizing the delivery of computer-related products and services to the institution. The project required a plan and marketing strategy that would draw attention to the facility's value-added offerings and services. A significant outcome of the project has been the change in the CRC's internal organization. The realignment of internal policies and practices, together with additions to its product and service offerings, has brought an increased focus to the facility. This movement from a small, fractious organization into one that is still small yet well organized and focused on its mission and goals has been a significant transition. Indicative of this turnaround was the sharing of information. One-on-one and small-group meetings, together with statistics showing work activity, were invaluable in gaining support for more equitable workload distribution and in removing blame and finger-pointing. Sharing monthly reports on sales and operating costs also had a positive impact.

  2. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way, computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  3. CPP-603 Underwater Fuel Storage Facility Site Integrated Stabilization Management Plan (SISMP), Volume I

    Energy Technology Data Exchange (ETDEWEB)

    Denney, R.D.

    1995-10-01

    The CPP-603 Underwater Fuel Storage Facility (UFSF) Site Integrated Stabilization Management Plan (SISMP) has been constructed to describe the activities required for the relocation of spent nuclear fuel (SNF) from the CPP-603 facility. These activities are the only Idaho National Engineering Laboratory (INEL) actions identified in the Implementation Plan developed to meet the requirements of the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 94-1 to the Secretary of Energy regarding an improved schedule for remediation in the Defense Nuclear Facilities Complex. As described in the DNFSB Recommendation 94-1 Implementation Plan, issued February 28, 1995, an INEL Spent Nuclear Fuel Management Plan is currently under development to direct the placement of SNF currently in existing INEL facilities into interim storage, and to address the coordination of intrasite SNF movements with new receipts and intersite transfers that were identified in the DOE SNF Programmatic and INEL Environmental Restoration and Waste Management Environmental Impact Statement Record of Decision. This SISMP will be a subset of the INEL Spent Nuclear Fuel Management Plan, and the activities described are being coordinated with other INEL SNF management activities. The CPP-603 relocation activities have been assigned a high priority so that established milestones will be met, but there will be some cases where other activities will take precedence in the utilization of available resources. The Draft INEL Site Integrated Stabilization Management Plan (SISMP), INEL-94/0279, Draft Rev. 2, dated March 10, 1995, is being superseded by the INEL Spent Nuclear Fuel Management Plan and this CPP-603-specific SISMP.

  4. Computational modeling of red blood cells: A symplectic integration algorithm

    Science.gov (United States)

    Schiller, Ulf D.; Ladd, Anthony J. C.

    2010-03-01

    Red blood cells can undergo shape transformations that impact the rheological properties of blood. Computational models have to account for this deformability, and red blood cells are often modeled as elastically deformable objects. We present a symplectic integration algorithm for deformable objects. The surface is represented by a set of marker points obtained by surface triangulation, along with a set of fiber vectors that describe the orientation of the material plane. The various elastic energies are formulated in terms of these variables, and the equations of motion are obtained by exact differentiation of a discretized Hamiltonian. The integration algorithm preserves the Hamiltonian structure and leads to highly accurate energy conservation; hence the method is expected to be more stable than conventional finite element methods. We apply the algorithm to simulate the shape dynamics of red blood cells.
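
    The key property claimed for the symplectic scheme, long-time energy conservation, can be demonstrated on a toy Hamiltonian: the sketch below compares a velocity-Verlet (leapfrog) step against explicit Euler for H = (p^2 + q^2)/2. This is a generic illustration, not the paper's triangulated-membrane model.

```python
# Symplectic velocity-Verlet vs. explicit Euler on a harmonic oscillator.
def force(q):           # -dV/dq for V = q^2/2
    return -q

def verlet(q, p, dt):   # symplectic, second order
    p += 0.5 * dt * force(q)
    q += dt * p
    p += 0.5 * dt * force(q)
    return q, p

def euler(q, p, dt):    # non-symplectic reference
    return q + dt * p, p + dt * force(q)

E0 = 0.5  # energy of the initial state (q, p) = (1, 0)
for stepper in (verlet, euler):
    q, p = 1.0, 0.0
    for _ in range(100000):
        q, p = stepper(q, p, 0.01)
    E = 0.5 * (p * p + q * q)
    # Verlet stays within ~1e-5 of E0; Euler's energy blows up.
    print(f"{stepper.__name__:6s} energy drift: {E - E0:+.2e}")
```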

  5. Distributed computer control system in the Nova Laser Fusion Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    1985-09-01

    The EE Technical Review has two purposes - to inform readers of various activities within the Electronics Engineering Department and to promote the exchange of ideas. The articles, by design, are brief summaries of EE work. The articles included in this report are as follows: Overview - Nova Control System; Centralized Computer-Based Controls for the Nova Laser Facility; Nova Pulse-Power Control System; Nova Laser Alignment Control System; Nova Beam Diagnostic System; Nova Target-Diagnostics Control System; and Nova Shot Scheduler. The 7 papers are individually abstracted.

  6. Integrating GPGPU computations with CPU coroutines in C++

    Science.gov (United States)

    Lebedev, Pavel A.

    2016-02-01

    We present results on the integration of two major GPGPU APIs with a reactor-based event processing model in C++ that utilizes coroutines. Given the current lack of a universally usable GPGPU programming interface that delivers optimal performance, and ongoing debates about the style of implementing asynchronous computing in C++, we present a working implementation that allows a uniform and seamless approach to writing C++ code with continuations that allows processing on CPUs or CUDA/OpenCL accelerators. Performance results are provided which show that, if corner cases are avoided, this approach has a negligible latency cost.
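
    Although the paper works in C++ with CUDA/OpenCL, the reactor-with-continuations pattern it describes can be sketched language-neutrally. Below is a Python asyncio analogue, in which a coroutine suspends while a stubbed "kernel" runs off-thread and resumes when the result is ready; simulate_kernel is a stand-in for an actual accelerator launch, not anything from the paper.

```python
# asyncio analogue of the reactor-with-continuations offload pattern.
import asyncio
import time

def simulate_kernel(x):          # blocking stand-in for "accelerator" work
    time.sleep(0.1)
    return x * x

async def process(loop, item):
    # suspend this coroutine until the offloaded computation completes;
    # the event loop keeps scheduling other coroutines meanwhile
    result = await loop.run_in_executor(None, simulate_kernel, item)
    print(f"item {item} -> {result}")

async def main():
    loop = asyncio.get_running_loop()
    await asyncio.gather(*(process(loop, i) for i in range(4)))

asyncio.run(main())
```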

  7. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  8. Fully integrated wireless inductive tongue computer interface for disabled people.

    Science.gov (United States)

    Struijk, Lotte N S Andreasen; Lontis, Eugen Romulus; Bentsen, Bo; Christensen, Henrik Vie; Caltenco, Hector A; Lund, Morten Enemark

    2009-01-01

    This work describes a novel fully integrated inductive tongue computer interface for disabled people. The interface consists of an oral unit placed in the mouth, including inductive sensors, related electronics, a system for wireless transmission and a rechargeable battery. The system is activated using an activation unit placed on the tongue, and incorporates 18 inductive sensors, arranged in both a key area and a mouse-pad area. The system's functionality was demonstrated in a pilot experiment, where a typing rate of up to 70 characters/minute was obtained with an error rate of 3%. Future work will include tests with disabled subjects.

  9. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    Energy Technology Data Exchange (ETDEWEB)

    Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Norman, A. [Fermilab; Timm, S. [Fermilab; Tiradani, A. [Fermilab

    2017-03-15

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project demonstrated the use of the "elastic" provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 25 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA used the same familiar services as for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper

  10. A Study of Critical Flowrate in the Integral Effect Test Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeongsik; Ryu, Sunguk; Cho, Seok; Yi, Sungjae; Park, Hyunsik [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In earlier studies, most of the information available in the literature was either for a saturated two-phase flow or a sub-cooled water flow at medium pressure conditions, e.g., up to about 7.0 MPa. Choking is regarded as the condition of maximum possible discharge through a given orifice and/or nozzle exit area; the critical flow rate is reached at choking under the given thermal-hydraulic conditions. Critical flow phenomena have been studied extensively in both single-phase and two-phase systems because of their importance in the LOCA analyses of light water reactors and in the design of other engineering systems. Park suggested a modified correlation for predicting the critical flow of sub-cooled water through a nozzle. Recently, Park et al. performed an experimental study of two-phase critical flow with a noncondensable gas at high pressure conditions. Various critical flow experiments using sub-cooled water were performed to model break simulators in thermal-hydraulic integral effect test facilities for light water reactors, e.g., the advanced power reactor 1400 MWe (APR1400) and the system-integrated modular advanced reactor (SMART). For the design of break simulators in SBLOCA scenarios, the aspect ratio (L/D) is considered a key parameter determining the shape of a break simulator. In this paper, critical flow phenomena are investigated with particular attention to break simulators for LOCA scenarios in the integral effect test facilities of KAERI, such as ATLAS and FESTA. Various critical flow models for sub-cooled and/or saturated water are reviewed, and for the selected test data the effects of diameter, the predictions of the critical flow models, and the break simulators for SBLOCA in the integral effect test facilities are discussed.
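
    For rough orientation on the magnitudes involved (this is a textbook Bernoulli-type estimate, not one of the reviewed models), the critical mass flux of highly sub-cooled water is often approximated as G = Cd * sqrt(2 * rho * (P0 - Psat)), with flashing at the throat; all values below are assumed for illustration.

```python
# Bernoulli-type estimate of sub-cooled critical mass flux (illustrative).
import math

Cd = 0.61          # discharge coefficient (assumed, sharp-edged orifice)
rho = 740.0        # kg/m^3, liquid water density near ~300 C (approx.)
P0 = 15.0e6        # Pa, stagnation pressure (approx. PWR condition)
Psat = 8.6e6       # Pa, saturation pressure at ~300 C (approx.)

G = Cd * math.sqrt(2 * rho * (P0 - Psat))   # kg/(m^2 s)
d = 0.01                                     # m, break diameter (assumed)
A = math.pi * d**2 / 4
print(f"critical mass flux ~ {G:,.0f} kg/m^2/s; break flow ~ {G*A:.2f} kg/s")
```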

  11. The MELiSSA Pilot Plant Facility: Objectives and Integration Strategy

    Science.gov (United States)

    Gòdia, F.; Pérez, J.; Albiol, J.; Lasseur, C.; Lamaze, B.; Ordóñez, L.

    MELiSSA (Micro-Ecological Life Support System Alternative) is a closed artificial ecosystem intended as a tool for the development of a bio-regenerative life support system for long-term manned missions, e.g. a planetary base. For its study and implementation, the MELiSSA loop has been divided into five interconnected compartments, organized in three different loops: solid, liquid, and gas. These compartments are microbial bioreactors and higher plant chambers. The MELiSSA Pilot Plant facility, an ESA External Laboratory located at the Universitat Autònoma de Barcelona, has been conceived to achieve a preliminary terrestrial demonstration of the MELiSSA concept at pilot scale, using animals as a model to substitute the crew. The experience gained in the operation of such a facility will be highly relevant for planning future life support systems in space. In order to fulfill this challenging objective, a number of steps have to be covered, from the individual design of each compartment to the continuous operation of the complete loop with all compartments interconnected, operating in sterile, controlled, and biosafe conditions. A new site for the MELiSSA Pilot Plant facility has recently been completed to host the final integration of the complete loop. The contribution will cover the general design aspects of the loop, including the current state of the different compartments and their interconnection with the solid, liquid, and gas loops, and the future plans of how these different elements will be integrated to achieve the final

  12. Conjunctive operation of river facilities for integrated water resources management in Korea

    Science.gov (United States)

    Kim, Hwirin; Jang, Cheolhee; Kim, Sung

    2016-10-01

    With the increasing trend of water-related disasters such as floods and droughts resulting from climate change, the integrated management of water resources has recently been gaining importance. Korea has worked towards preventing disasters caused by floods and droughts, managing water resources efficiently through the coordinated operation of river facilities such as dams, weirs, and agricultural reservoirs. This has been pursued to enable everyone to enjoy the benefits inherent to the utilization of water resources, by preserving functional rivers, improving their utility and reducing the degradation of water quality caused by floods and droughts. At the same time, coordinated activities are being conducted in multi-purpose dams, hydro-power dams, weirs, agricultural reservoirs and water use facilities (featuring a daily water intake of over 100 000 m3 day-1) with the purpose of monitoring the management of such facilities. This is being done to ensure the protection of public interest without acting as an obstacle to sound water management practices. During the flood season, each facility maintains flood control capacity through a limited operating level determined in advance by the Regulation Council. Dam flood discharge decisions are approved through the flood forecasting and management of the Flood Control Office so as to minimize flood damage both upstream and downstream. During the dry season, the operational plan predetermined by the council is implemented to ensure adequate water quantity and distribution.

  13. Conjunctive operation of river facilities for integrated water resources management in Korea

    Directory of Open Access Journals (Sweden)

    H. Kim

    2016-10-01

    Full Text Available With the increasing trend of water-related disasters such as floods and droughts resulting from climate change, the integrated management of water resources has recently been gaining importance. Korea has worked towards preventing disasters caused by floods and droughts, managing water resources efficiently through the coordinated operation of river facilities such as dams, weirs, and agricultural reservoirs. This has been pursued to enable everyone to enjoy the benefits inherent to the utilization of water resources, by preserving functional rivers, improving their utility and reducing the degradation of water quality caused by floods and droughts. At the same time, coordinated activities are being conducted in multi-purpose dams, hydro-power dams, weirs, agricultural reservoirs and water use facilities (featuring a daily water intake of over 100 000 m3 day−1) with the purpose of monitoring the management of such facilities. This is being done to ensure the protection of public interest without acting as an obstacle to sound water management practices. During the flood season, each facility maintains flood control capacity through a limited operating level determined in advance by the Regulation Council. Dam flood discharge decisions are approved through the flood forecasting and management of the Flood Control Office so as to minimize flood damage both upstream and downstream. During the dry season, the operational plan predetermined by the council is implemented to ensure adequate water quantity and distribution.

  14. Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules.

    Directory of Open Access Journals (Sweden)

    Konda Leela Sarath Kumar

    Full Text Available Skin sensitization forms a major toxicological endpoint for dermatology and cosmetic products. The recent ban on animal testing for cosmetics demands alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools, i.e. high false positive rates and/or limited coverage. The key components of our solution include: QSAR models selected from a combinatorial set, similarity information, and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools, including VEGA (accuracy = 45.00% and CCR = 54.17% with 'High' reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), the coverage was very low (only 10 out of 77 molecules were predicted reliably). Owing to improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to the cosmetic/dermatology industry for pre-screening molecules, and for reducing time, cost and animal testing.
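
    The CCR values quoted above are consistent with CCR taken as balanced accuracy, i.e. the mean of sensitivity and specificity; a one-line check reproduces the SkinSense figure:

    ```python
    def ccr(sensitivity: float, specificity: float) -> float:
        """Correct classification rate as balanced accuracy."""
        return (sensitivity + specificity) / 2.0

    print(ccr(70.00, 78.72))  # 74.36, matching the reported CCR
    ```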

  15. Computational Acoustics: Computational PDEs, Pseudodifferential Equations, Path Integrals, and All That Jazz

    Science.gov (United States)

    Fishman, Louis

    2000-11-01

    The role of mathematical modeling in the physical sciences will be briefly addressed. Examples will focus on computational acoustics, with applications to underwater sound propagation, electromagnetic modeling, optics, and seismic inversion. Direct and inverse wave propagation problems in both the time and frequency domains will be considered. Focusing on fixed-frequency (elliptic) wave propagation problems, the usual, two-way, partial differential equation formulation will be exactly reformulated, in a well-posed manner, as a one-way (marching) problem. This is advantageous for both direct and inverse considerations, as well as stochastic modeling problems. The reformulation will require the introduction of pseudodifferential operators and their accompanying phase space analysis (calculus), in addition to path integral representations for the fundamental solutions and their subsequent computational algorithms. Unlike the more traditional, purely numerical applications of, for example, finite-difference and finite-element methods, this approach, in effect, writes the exact, or, more generally, the asymptotically correct, answer as a functional integral and, subsequently, computes it directly. The overall computational philosophy is to combine analysis, asymptotics, and numerical methods to attack complicated, real-world problems. Exact and asymptotic analysis will stress the complementary nature of the direct and inverse formulations, as well as indicating the explicit structural connections between the time- and frequency-domain solutions.
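
    A minimal illustration of the one-way "marching" idea is the split-step Fourier solver for the one-way Helmholtz equation in a homogeneous medium: the square-root (pseudodifferential) operator is diagonal in the transverse wavenumber domain, so each range step is an exact spectral multiply. The sketch below assumes this simplified homogeneous setting, not the full variable-medium construction described in the abstract:

    ```python
    import numpy as np

    def one_way_march(u0, dx, dz, nz, k0):
        """March a transverse field u0(x) through nz range steps of size dz,
        applying exp(i*dz*sqrt(k0^2 - kx^2)) in the spectral domain."""
        kx = 2.0 * np.pi * np.fft.fftfreq(u0.size, d=dx)
        kz = np.sqrt((k0**2 - kx**2).astype(complex))  # evanescent parts decay
        propagator = np.exp(1j * kz * dz)
        u = u0.astype(complex)
        for _ in range(nz):
            u = np.fft.ifft(propagator * np.fft.fft(u))
        return u

    # Assumed example: 1 kHz Gaussian beam in water (c = 1500 m/s)
    x = np.linspace(-200.0, 200.0, 1024)
    u = one_way_march(np.exp(-(x / 20.0)**2), dx=x[1] - x[0], dz=5.0, nz=100,
                      k0=2.0 * np.pi * 1000.0 / 1500.0)
    ```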

  16. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a

  17. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  18. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.

    1999-08-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike.
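
    For collaborators, between-pulse access through MDSplus looks roughly like the following thin-client sketch (the host, tree, and signal names here are placeholders, not actual DIII-D addresses):

    ```python
    from MDSplus import Connection

    conn = Connection("mds-server.example.org")  # hypothetical server
    conn.openTree("d3d", 123456)                 # tree name and shot number
    ip = conn.get(r"\ip").data()                 # fetch an analyzed signal
    t = conn.get(r"dim_of(\ip)").data()          # and its time base
    print(ip.shape, t.shape)
    ```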

  19. SENSOR FUSION CONTROL SYSTEM FOR COMPUTER INTEGRATED MANUFACTURING

    Directory of Open Access Journals (Sweden)

    C.M. Kumile

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Manufacturing companies of today face unpredictable, high frequency market changes driven by global competition. To stay competitive, these companies must have the characteristics of cost-effective rapid response to the market needs. As an engineering discipline, mechatronics strives to integrate mechanical, electronic, and computer systems optimally in order to create high precision products and manufacturing processes. This paper presents a methodology of increasing flexibility and reusability of a generic computer integrated manufacturing (CIM) cell-control system using simulation and modelling of mechatronic sensory system (MSS) concepts. The utilisation of sensors within the CIM cell is highlighted specifically for data acquisition, analysis, and multi-sensor data fusion. Thus the designed reference architecture provides comprehensive insight into the functions and methodologies of a generic shop-floor control system (SFCS), which consequently enables the rapid deployment of a flexible system.

    AFRIKAANSE OPSOMMING (translated): Today's manufacturing enterprises regularly experience unpredictable market changes driven by worldwide competition. To remain competitive, these enterprises must exhibit the characteristics of cost-effectiveness and rapid response to market fluctuations. Mechatronics strives to integrate mechanical, electronic, and computer systems optimally in order to create high-precision products and production processes. This article suggests a methodology for increasing the adaptability and reusability of a generic computer-integrated manufacturing-cell control system through the use of simulation and the modelling of mechatronic sensor-system concepts. The deployment of sensors within the cell facilitates data acquisition, analysis, and multi-sensor data fusion. In this way the designed architecture provides insight into the function and methodology of a generic shop-floor control system that the rapid
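
    A common baseline for the multi-sensor data fusion step described above is inverse-variance weighting of redundant measurements; the sketch below shows that rule on invented sensor readings (the paper's MSS architecture is more general):

    ```python
    def fuse(measurements):
        """measurements: list of (value, variance) pairs from redundant sensors.
        Returns the inverse-variance weighted estimate and its variance."""
        weights = [1.0 / var for _, var in measurements]
        value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
        return value, 1.0 / sum(weights)

    # Hypothetical part-position readings (mm): vision system vs. encoder
    value, var = fuse([(102.4, 0.25), (102.1, 0.04)])
    print(f"fused position: {value:.2f} mm (variance {var:.3f} mm^2)")
    ```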

  20. Secondary Waste Cementitious Waste Form Data Package for the Integrated Disposal Facility Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cantrell, Kirk J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Westsik, Joseph H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Serne, R Jeffrey [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Um, Wooyong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cozzi, Alex D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-16

    A review of the most up-to-date and relevant data currently available was conducted to develop a set of recommended values for use in the Integrated Disposal Facility (IDF) performance assessment (PA) to model contaminant release from a cementitious waste form for aqueous wastes treated at the Hanford Effluent Treatment Facility (ETF). This data package relies primarily upon recent data collected on Cast Stone formulations fabricated with simulants of low-activity waste (LAW) and liquid secondary wastes expected to be produced at Hanford. These data were supplemented, when necessary, with data developed for saltstone (a similar grout waste form used at the Savannah River Site). Work is currently underway to collect data on cementitious waste forms that are similar to Cast Stone and saltstone but are tailored to the characteristics of ETF-treated liquid secondary wastes. Recommended values for key parameters to conduct PA modeling of contaminant release from ETF-treated liquid waste are provided.

  1. Integrated Geo Hazard Management System in Cloud Computing Technology

    Science.gov (United States)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can degrade environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing a geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and which can be commanded remotely to collect and control data using "cloud" computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud" system. This system will later be used as part of the development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  3. DOE standard: Integration of environment, safety, and health into facility disposition activities. Volume 1: Technical standard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    This Department of Energy (DOE) technical standard (referred to as the Standard) provides guidance for integrating and enhancing worker, public, and environmental protection during facility disposition activities. It provides environment, safety, and health (ES and H) guidance to supplement the project management requirements and associated guidelines contained within DOE O 430.1A, Life-Cycle Asset Management (LCAM), and amplified within the corresponding implementation guides. In addition, the Standard is designed to support an Integrated Safety Management System (ISMS), consistent with the guiding principles and core functions contained in DOE P 450.4, Safety Management System Policy, and discussed in DOE G 450.4-1, Integrated Safety Management System Guide. The ISMS guiding principles represent the fundamental policies that guide the safe accomplishment of work and include: (1) line management responsibility for safety; (2) clear roles and responsibilities; (3) competence commensurate with responsibilities; (4) balanced priorities; (5) identification of safety standards and requirements; (6) hazard controls tailored to work being performed; and (7) operations authorization. This Standard specifically addresses the implementation of the above ISMS principles four through seven, as applied to facility disposition activities.

  4. Preservice teachers’ preparedness to integrate computer technology into the curriculum

    Directory of Open Access Journals (Sweden)

    Jelena Magliaro

    2008-05-01

    Full Text Available For Canada to compete effectively in the digital world, beginning teachers need to play an important role in integrating computer technology into the curriculum. Equipment and connectivity do not guarantee successful or productive use of computers in the classroom, but the combination of the teaching style and technology use has the potential to change education. In this research, the computer self-efficacy beliefs of 210 preservice teachers after their first practice teaching placements were examined. First, the quantitative component of the study involved the use of the Computer User Self-Efficacy (CUSE) scale, where students’ previous undergraduate degree, licensure area, experience and familiarity with software packages were found to have statistically significant effects on computer self-efficacy. Second, the qualitative data indicated that society and school were the most positive factors that influenced preservice teachers’ attitudes towards computers, while the family had the highest percentage of negative influence. Findings reveal that although preservice teachers had completed only two months of the program, those with higher CUSE scores were more ready to integrate computers into their lessons than those with lower scores. Résumé (translated): For Canada to compete effectively in the digital world, new teachers will have to play an important role in integrating computer technologies into the curriculum. Equipment and connectivity do not guarantee successful or productive use of computers in the classroom, but the combination of teaching styles and technology use has the potential to change education. In this study, the computer self-efficacy beliefs of 210 future teachers after their first placement were examined. First, the quantitative component of the study involved the use of the Computer

  5. Near-Field Hydrology Data Package for the Integrated Disposal Facility 2005 Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Saripalli, Prasad; Freedman, Vicky L.

    2004-06-25

    CH2MHill Hanford Group, Inc. (CHG) is designing and assessing the performance of an Integrated Disposal Facility (IDF) to receive immobilized low-activity waste (ILAW), Low-Level and Mixed Low-Level Wastes (LLW/MLLW), and the Waste Treatment Plant (WTP) melters used to vitrify the ILAW. The IDF Performance Assessment (PA) assesses the performance of the disposal facility to provide a reasonable expectation that the disposal of the waste is protective of the general public, groundwater resources, air resources, surface water resources, and inadvertent intruders. The PA requires prediction of contaminant migration from the facilities, which is expected to occur primarily via the movement of water through the facilities and the consequent transport of dissolved contaminants in the pore water of the vadose zone. Pacific Northwest National Laboratory (PNNL) assists CHG in its performance assessment activities. One of PNNL’s tasks is to provide estimates of the physical, hydraulic, and transport properties of the materials comprising the disposal facilities and the disturbed region around them. These materials are referred to as the near-field materials. Their properties are expressed as parameters of constitutive models used in simulations of subsurface flow and transport. In addition to the best-estimate parameter values, information on uncertainty in the parameter values and estimates of the changes in parameter values over time are required to complete the PA. These parameter estimates and information were previously presented in a report prepared for the 2001 ILAW PA. This report updates the parameter estimates for the 2005 IDF PA using additional information and data collected since publication of the earlier report.
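
    Data packages of this kind typically report parameters for standard constitutive models; water retention, for instance, is commonly described by the van Genuchten model. The sketch below shows that model with invented parameter values, not the recommended IDF estimates:

    ```python
    def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
        """Volumetric water content at suction head h (cm), van Genuchten model
        with m = 1 - 1/n."""
        m = 1.0 - 1.0 / n
        effective_saturation = (1.0 + (alpha * h) ** n) ** (-m)
        return theta_r + (theta_s - theta_r) * effective_saturation

    # Hypothetical sandy backfill: water content at 100 cm suction
    print(van_genuchten_theta(h=100.0, theta_r=0.03, theta_s=0.40, alpha=0.02, n=1.8))
    ```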

  6. Control computers and automation subsystem equipment in Del'fin facility

    Energy Technology Data Exchange (ETDEWEB)

    Allin, A.P.; Belen' kiy, Yu.M.; Borzyak, Yu.V.; Bykovskiy, N.E.; Grigor' ev, V.E.; Gusyatnikov, B.S.; Doroshkevich, I.L.; Ivanov, V.V.; Kuchinskiy, A.G.; Savchenko, V.M.

    1983-01-01

    The power equipment of the Del'fin laser facility contains a 10^7 J capacitor bank divided into four identical sections feeding a power preamplifier and three output stages each, with 328 IFP-20,000 flash tubes designed to produce 2.5 kJ laser radiation. The system for controlling and automating laser experiments, modeled after the SHIVA system (Lawrence Livermore Laboratory), includes a computer complex in the central console and the sequential ring bus, with CAMAC peripheral stations inside the optical chamber including three microcomputers (Polon, Nuclear Enterprise, HENESA). The control computer with a 28 K memory is linked to the CAMAC stations and to a terminal DECWRITER-II. The system crate contains a 9030/32 interface, a 3992 driver for the sequential bus, and a 064 LAM GRADER interrogation processing module. The computer complex also includes an RSX-11M multiprogram multipurpose real-time disk operating system which uses standard DECNET-11 software and includes a translator from MACRO-11 assembler language to FORTRAN-4, BASIC-11, COBOL and a few other languages. The laser power is automatically controlled through the CAMAC stations according to a main program as well as dialog maintenance programs (BCE, KASKN, KASN, DIAL, STRB, MODB, BKYM, STRB, BKYB, BKY) and measurement programs (CONTRO, CNTRO, KOD) designed to ensure simple and reliable high-speed control of laser experiments. All alignment and regulation of the laser facility is automated through optical channels (aligning LTI-501 laser, collimators, lenses, auxiliary optics) and servomechanisms (coordinate photoreceiver-homing signal module-step motors) designed for positioning and orientating mirrors 80 mm and 30 mm in diameter. 25 references, 31 figures, 2 tables.

  7. Waste Form Release Calculations for the 2005 Integrated Disposal Facility Performance Assessment. Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  8. Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  9. Waste Form Release Calculations for the 2005 Integrated Disposal Facility Performance Assessment Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  10. Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45, and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  11. MPL - a program for computations with iterated integrals on moduli spaces of curves of genus zero

    CERN Document Server

    Bogner, Christian

    2015-01-01

    We introduce the computer program MPL for computations with homotopy invariant iterated integrals on moduli spaces $\mathcal{M}_{0,n}$ of curves of genus 0 with $n$ ordered marked points. The program is an implementation of the algorithms presented in [13], based on Maple. It includes the symbol map and procedures for the analytic computation of period integrals on $\mathcal{M}_{0,n}$. It supports the automated computation of a certain class of Feynman integrals.
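
    For orientation, the basic objects MPL works with are iterated integrals in the sense of Chen. With a path $\gamma\colon[0,1]\to\mathcal{M}_{0,n}$ and 1-forms $\omega_i$ whose pullbacks are $\gamma^*\omega_i = f_i(t)\,dt$, the standard definition reads as follows (ordering conventions differ between references; this is not excerpted from [13]):

    ```latex
    \[
      \int_{\gamma} \omega_1 \omega_2 \cdots \omega_m
      \;=\;
      \int_{0 \le t_1 \le \cdots \le t_m \le 1}
      f_1(t_1)\, f_2(t_2) \cdots f_m(t_m)\; dt_1\, dt_2 \cdots dt_m .
    \]
    ```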

  12. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect its computing infrastructure. The research outcomes from this project try to address all these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  13. CAPA-An integrated computer-assisted personalized assignment system

    Science.gov (United States)

    Kashy, E.; Sherrill, B. M.; Tsai, Y.; Thaler, D.; Weinshank, D.; Engelmann, M.; Morrissey, D. J.

    1993-12-01

    A new integrated computer-assisted personalized assignment (CAPA) system that creates individual assignments for each student has been developed and found to be a powerful motivator. The CAPA system allows students to enter their answers to personalized assignments directly via networked terminals, gives immediate feedback and hints (allowing challenging questions), while providing the instructor with on-line performance information. The students are encouraged to study together, which is known to be an effective learning strategy, but each must still obtain his/her own correct answers. Students are allowed to re-enter solutions to the problems before the due date without penalty, thus providing students with different skill levels the opportunity and incentive to understand the material without being judged during the learning process. The features and operation of the system are described, observations on its use in an introductory general physics class are reported, and some of the highly favorable student reactions are included.
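
    The core mechanism, an assignment whose numbers are derived deterministically from each student's identity so that answers differ but the physics is shared, can be sketched as follows (illustrative only; this is not the CAPA implementation):

    ```python
    import random
    from math import sin, radians

    def problem_parameters(student_id: str):
        """Reproducible per-student numbers: seeding by ID lets the grader
        re-derive the same problem when checking a submitted answer."""
        rng = random.Random(student_id)
        v0 = rng.randint(10, 30)                  # launch speed, m/s
        angle = rng.choice([30, 37, 45, 53, 60])  # launch angle, degrees
        return v0, angle

    def check_answer(student_id: str, submitted: float, tol=0.01) -> bool:
        v0, angle = problem_parameters(student_id)
        expected = v0**2 * sin(radians(2 * angle)) / 9.81  # projectile range, m
        return abs(submitted - expected) <= tol * abs(expected)

    v0, angle = problem_parameters("A12345")
    print(f"A ball is launched at {v0} m/s and {angle} degrees. Find the range (m).")
    ```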

  14. Multiscale mechanobiology: computational models for integrating molecules to multicellular systems.

    Science.gov (United States)

    Mak, Michael; Kim, Taeyoon; Zaman, Muhammad H; Kamm, Roger D

    2015-10-01

    Mechanical signals exist throughout the biological landscape. Across all scales, these signals, in the form of force, stiffness, and deformations, are generated and processed, resulting in an active mechanobiological circuit that controls many fundamental aspects of life, from protein unfolding and cytoskeletal remodeling to collective cell motions. The multiple scales and complex feedback involved present a challenge for fully understanding the nature of this circuit, particularly in development and disease in which it has been implicated. Computational models that accurately predict and are based on experimental data enable a means to integrate basic principles and explore fine details of mechanosensing and mechanotransduction in and across all levels of biological systems. Here we review recent advances in these models along with supporting and emerging experimental findings.

  15. Integrated Task Clustering, Mapping and Scheduling for Heterogeneous Computing Systems

    Directory of Open Access Journals (Sweden)

    Yuet Ming Lam

    2012-03-01

    Full Text Available This paper presents a new approach for mapping and scheduling task graphs for heterogeneous hardware/software computing systems using heuristic search. Task mapping and scheduling are vital in hardware/software codesign, and previous approaches that treat them separately lead to suboptimal solutions. In this paper, we propose two techniques to enhance the speedup of mapping/scheduling solutions: (1) an integrated technique combining task clustering, mapping, and scheduling, and (2) a multiple neighborhood function strategy. Our approach is demonstrated by case studies involving 40 randomly generated task graphs, as well as six applications. Experimental results show that our proposed approach outperforms a separate approach in terms of speedup by up to 18.3% for a system with a microprocessor, a floating-point digital signal processor, and an FPGA.
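
    The baseline such integrated approaches improve on is a greedy list scheduler that maps each ready task to whichever resource finishes it earliest; a minimal sketch (communication costs ignored, task costs invented):

    ```python
    def greedy_schedule(tasks, deps, cost):
        """tasks: ids in topological order; deps: task -> set of predecessors;
        cost: (task, proc) -> execution time. Returns assignment and makespan."""
        procs = {p for (_, p) in cost}
        free_at = {p: 0.0 for p in procs}  # earliest time each processor is free
        finish, where = {}, {}
        for t in tasks:
            best = None
            for p in procs:
                start = max([free_at[p]] + [finish[d] for d in deps.get(t, ())])
                candidate = (start + cost[(t, p)], p)
                if best is None or candidate < best:
                    best = candidate
            finish[t], where[t] = best
            free_at[best[1]] = best[0]
        return where, max(finish.values())

    cost = {("a", "cpu"): 4, ("a", "fpga"): 2, ("b", "cpu"): 3,
            ("b", "fpga"): 6, ("c", "cpu"): 2, ("c", "fpga"): 1}
    print(greedy_schedule(["a", "b", "c"], {"c": {"a", "b"}}, cost))
    ```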

  16. IOTA (Integrable Optics Test Accelerator): facility and experimental beam physics program

    Energy Technology Data Exchange (ETDEWEB)

    Antipov, S.; Broemmelsiek, D.; Bruhwiler, D.; Edstrom, D.; Harms, E.; Lebedev, V.; Leibfritz, J.; Nagaitsev, S.; Park, C. S.; Piekarz, H.; Piot, P.; Prebys, E.; Romanov, A.; Ruan, J.; Sen, T.; Stancari, G.; Thangaraj, C.; Thurman-Keup, R.; Valishev, A.; Shiltsev, V.

    2017-03-01

    The Integrable Optics Test Accelerator (IOTA) is a storage ring for advanced beam physics research currently being built and commissioned at Fermilab. It will operate with protons and electrons using injectors with momenta of 70 and 150 MeV/c, respectively. The research program includes the study of nonlinear focusing integrable optical beam lattices based on special magnets and electron lenses, beam dynamics of space-charge effects and their compensation, optical stochastic cooling, and several other experiments. In this article, we present the design and main parameters of the facility, outline progress to date and provide the timeline of the construction, commissioning and research. The physical principles, design, and hardware implementation plans for the major IOTA experiments are also discussed.

  17. Integrating Computing across the Curriculum: The Impact of Internal Barriers and Training Intensity on Computer Integration in the Elementary School Classroom

    Science.gov (United States)

    Coleman, LaToya O.; Gibson, Philip; Cotten, Shelia R.; Howell-Moroney, Michael; Stringer, Kristi

    2016-01-01

    This study examines the relationship between internal barriers, professional development, and computer integration outcomes among a sample of fourth- and fifth-grade teachers in an urban, low-income school district in the Southeastern United States. Specifically, we examine the impact of teachers' computer attitudes, computer anxiety, and computer…

  18. Integrated doses calculation in evacuation scenarios of the neutron generator facility at Missouri S&T

    Science.gov (United States)

    Sharma, Manish K.; Alajo, Ayodeji B.

    2016-08-01

    Any source of ionizing radiation could lead to considerable dose acquisition by individuals in a nuclear facility. Evacuation may be required when elevated levels of radiation are detected within a facility. In this situation, individuals are more likely to take the closest exit. This may not be the most expedient decision, as it may lead to higher dose acquisition. The strategy for preventing large dose acquisition should be predicated on the path that offers the least dose acquisition. In this work, the neutron generator facility at Missouri University of Science and Technology was analyzed. The Monte Carlo N-Particle (MCNP) radiation transport code was used to model the entire floor of the generator's building. The simulated dose rates in the hallways were used to estimate the integrated doses for different paths leading to exits. It was shown that the shortest path did not always lead to the minimum dose acquisition, and the approach was successful in predicting the expedient path, as opposed to the approach of taking the nearest exit.
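
    The path logic can be illustrated with Dijkstra's algorithm run twice over the same corridor graph: once with transit-time weights (nearest exit) and once with integrated-dose weights (dose rate times traversal time per segment). The graph below is invented, not the Missouri S&T facility model:

    ```python
    import heapq

    def dijkstra(graph, src, dst):
        """graph: node -> list of (neighbor, weight); returns (cost, path)."""
        queue, seen = [(0.0, src, [src])], set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == dst:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nbr, w in graph.get(node, []):
                if nbr not in seen:
                    heapq.heappush(queue, (cost + w, nbr, path + [nbr]))
        return float("inf"), []

    transit_h = {("room", "hall"): 0.01, ("hall", "exitA"): 0.02,
                 ("room", "corridor"): 0.03, ("corridor", "exitB"): 0.03}
    dose_rate = {("room", "hall"): 50.0, ("hall", "exitA"): 120.0,  # uSv/h
                 ("room", "corridor"): 5.0, ("corridor", "exitB"): 5.0}
    time_graph, dose_graph = {}, {}
    for (a, b), t in transit_h.items():
        time_graph.setdefault(a, []).append((b, t))
        dose_graph.setdefault(a, []).append((b, dose_rate[(a, b)] * t))
    for ex in ("exitA", "exitB"):  # virtual sink joining all exits
        time_graph.setdefault(ex, []).append(("out", 0.0))
        dose_graph.setdefault(ex, []).append(("out", 0.0))

    print("fastest route:  ", dijkstra(time_graph, "room", "out"))
    print("min-dose route: ", dijkstra(dose_graph, "room", "out"))
    ```

    On these invented numbers the fastest route exits through exitA while the minimum-dose route exits through exitB, mirroring the paper's observation that the nearest exit is not always the least-dose exit.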

  19. Integrated doses calculation in evacuation scenarios of the neutron generator facility at Missouri S&T

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Manish K.; Alajo, Ayodeji B., E-mail: alajoa@mst.edu

    2016-08-11

    Any source of ionizing radiation could lead to considerable dose acquisition by individuals in a nuclear facility. Evacuation may be required when elevated levels of radiation are detected within a facility. In this situation, individuals are more likely to take the closest exit. This may not be the most expedient decision, as it may lead to higher dose acquisition. The strategy for preventing large dose acquisition should be predicated on the path that offers the least dose acquisition. In this work, the neutron generator facility at Missouri University of Science and Technology was analyzed. The Monte Carlo N-Particle (MCNP) radiation transport code was used to model the entire floor of the generator's building. The simulated dose rates in the hallways were used to estimate the integrated doses for different paths leading to exits. It was shown that the shortest path did not always lead to the minimum dose acquisition, and the approach was successful in predicting the expedient path, as opposed to the approach of taking the nearest exit.

  20. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s, and consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560

  1. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort was split into two parts, the first part (DNE P1) providing support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed split directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, internal OLCF testbeds were used. Results are promising and OLCF is planning a full DNE deployment in the mid-2016 timeframe on production systems.

  2. Multilevel examination of facility characteristics, social integration, and health for older adults living in nursing homes.

    Science.gov (United States)

    Leedahl, Skye N; Chapin, Rosemary K; Little, Todd D

    2015-01-01

    Testing a model based on past research and theory, this study assessed relationships between facility characteristics (i.e., culture change efforts, social workers) and residents' social networks and social support across nursing homes; and examined relationships between multiple aspects of social integration (i.e., social networks, social capital, social engagement, social support) and mental and functional health for older adults in nursing homes. Data were collected at nursing homes using a planned missing data design with random sampling techniques. Data collection occurred at the individual level through in-person structured interviews with older adult nursing home residents (N = 140) and at the facility level (N = 30) with nursing home staff. The best fitting multilevel structural equation model indicated that the culture change subscale for relationships significantly predicted differences in residents' social networks. Additionally, social networks had a positive indirect relationship with mental and functional health among residents, primarily via social engagement. Social capital had a positive direct relationship with both health outcomes. To predict better social integration and mental and functional health outcomes for nursing home residents, study findings support prioritizing close relationships among staff, residents, and the community, as well as increased resident social engagement and social trust.

  3. An efficient numerical integral in three-dimensional electromagnetic field computations

    Science.gov (United States)

    Whetten, Frank L.; Liu, Kefeng; Balanis, Constantine A.

    1990-01-01

    An improved algorithm for efficiently computing a sinusoid and an exponential integral commonly encountered in method-of-moments solutions is presented. The new algorithm has been tested for accuracy and computer execution time against both numerical integration and other existing numerical algorithms, and has outperformed them. Typical execution time comparisons on several computers are given.
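
    The integral in question has an elementary closed form, and checking a candidate algorithm against adaptive quadrature is the natural accuracy test; a sketch with arbitrary parameter values:

    ```python
    from math import sin, cos, exp
    from scipy.integrate import quad

    def closed_form(a, b, L):
        """integral_0^L exp(-a*x) * sin(b*x) dx, for a > 0."""
        return (b - exp(-a * L) * (a * sin(b * L) + b * cos(b * L))) / (a**2 + b**2)

    a, b, L = 1.3, 7.0, 2.5
    numeric, _ = quad(lambda x: exp(-a * x) * sin(b * x), 0.0, L)
    print(closed_form(a, b, L), numeric)  # should agree to ~1e-12
    ```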

  4. Double crystal monochromator controlled by integrated computing on BL07A in New SUBARU, Japan

    Energy Technology Data Exchange (ETDEWEB)

    Okui, Masato, E-mail: okui@kohzu.co.jp [Kohzu Precision Co., Ltd., 2-6-15, Kurigi, Asao-ku, Kawasaki-shi, Kanagawa 215-8521 (Japan); Laboratory of Advanced Science and Technology for Industry, University of Hyogo (Japan); Yato, Naoki; Watanabe, Akinobu; Lin, Baiming; Murayama, Norio [Kohzu Precision Co., Ltd., 2-6-15, Kurigi, Asao-ku, Kawasaki-shi, Kanagawa 215-8521 (Japan); Fukushima, Sei, E-mail: FUKUSHIMA.Sei@nims.go.jp [Laboratory of Advanced Science and Technology for Industry, University of Hyogo (Japan); National Institute for Material Sciences (Japan); Kanda, Kazuhiro [Laboratory of Advanced Science and Technology for Industry, University of Hyogo (Japan)

    2016-07-27

    The BL07A beamline in New SUBARU, University of Hyogo, has been used for many studies of new materials. A new double crystal monochromator controlled by integrated computing was designed and installed in the beamline in 2014. In this report we will discuss the unique features of this new monochromator, MKZ-7NS. This monochromator was not designed exclusively for use in BL07A; on the contrary, it was designed to be installed at low cost in various beamlines to facilitate the industrial applications of medium-scale synchrotron radiation facilities. Thus, the design of the monochromator utilized common packages that can satisfy the wide variety of specifications required at different synchrotron radiation facilities. This monochromator can be easily optimized for any beamline due to the fact that a few control parameters can be suitably customized. The beam offset can be fixed precisely even if one of the two slave axes is omitted. This design reduces the convolution of mechanical errors. Moreover, the monochromator’s control mechanism is very compact, making it possible to reduce the size of the vacuum chamber.
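
    The kernel of any double-crystal monochromator controller is the energy-to-angle conversion from Bragg's law, E = hc/(2d sin θ); a sketch assuming a Si(111) crystal pair (the abstract does not state the MKZ-7NS crystals or control parameters):

    ```python
    from math import asin, degrees

    HC_KEV_ANGSTROM = 12.39842  # h*c in keV*Angstrom
    D_SI_111 = 3.13560          # Si(111) d-spacing, Angstrom (assumed crystal)

    def bragg_angle_deg(energy_kev: float, d: float = D_SI_111) -> float:
        """Bragg angle satisfying E = h*c / (2 * d * sin(theta))."""
        return degrees(asin(HC_KEV_ANGSTROM / (2.0 * d * energy_kev)))

    for e_kev in (2.0, 5.0, 10.0):
        print(f"{e_kev:5.1f} keV -> theta = {bragg_angle_deg(e_kev):6.3f} deg")
    ```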

  5. Integrated Computational System for Aerodynamic Steering and Visualization

    Science.gov (United States)

    Hesselink, Lambertus

    1999-01-01

    In February of 1994, an effort from the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center with McDonnel Douglas Aerospace Company and Stanford University was initiated to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment is not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will (1) briefly review the technologies developed during 1995-1997 (2) describe current technologies in the area of comparison techniques, (4) describe the theory of our new
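
    The topological signature underlying such comparisons is built from the critical points of the field, classified by the eigenvalues of the local Jacobian; a toy classifier in that spirit (not the system's actual feature extractor):

    ```python
    import numpy as np

    def classify_critical_point(jacobian):
        """Classify a 2-D critical point from its Jacobian eigenvalues."""
        ev = np.linalg.eigvals(jacobian)
        if np.all(np.isreal(ev)):
            if np.all(ev.real > 0):
                return "repelling node"
            if np.all(ev.real < 0):
                return "attracting node"
            return "saddle"
        return "spiral or center (complex eigenvalues)"

    # v(x, y) = (y, -x) has a center at the origin; its Jacobian:
    print(classify_critical_point(np.array([[0.0, 1.0], [-1.0, 0.0]])))
    ```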

  6. The Integrated Computational Environment for Airbreathing Hypersonic Flight Vehicle Modeling and Design Evaluation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated computational environment for multidisciplinary, physics-based simulation and analyses of airbreathing hypersonic flight vehicles will be developed....

  7. An Integrated Review of Emoticons in Computer-Mediated Communication.

    Science.gov (United States)

    Aldunate, Nerea; González-Ibáñez, Roberto

    2016-01-01

    Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. Contrarily, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, where facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic cues to convey emotional meaning. Research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels. Additionally, emoticons constitute a valuable resource for language comprehension by providing expressivity to text messages. The latter findings have been supported by studies in neuroscience showing that particular brain regions involved in emotional processing are also activated when people are exposed to emoticons. To reach an integrated understanding of the influence of emoticons in human communication on both socio-cognitive and neural levels, we review the literature on emoticons in three different areas. First, we present relevant literature on emoticons in CMC. Second, we study the influence of emoticons in language comprehension. Finally, we show the incipient research in neuroscience on this topic. This mini review reveals that, while there are plenty of studies on the influence of emoticons in communication from a social psychology perspective, little is known about the neurocognitive basis of the effects of emoticons on communication dynamics.

  8. An Integrated Review of Emoticons in Computer-Mediated Communication

    Science.gov (United States)

    Aldunate, Nerea; González-Ibáñez, Roberto

    2017-01-01

    Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. Contrarily, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, where facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic cues to convey emotional meaning. Research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels. Additionally, emoticons constitute a valuable resource for language comprehension by providing expressivity to text messages. The latter findings have been supported by studies in neuroscience showing that particular brain regions involved in emotional processing are also activated when people are exposed to emoticons. To reach an integrated understanding of the influence of emoticons in human communication on both socio-cognitive and neural levels, we review the literature on emoticons in three different areas. First, we present relevant literature on emoticons in CMC. Second, we study the influence of emoticons in language comprehension. Finally, we show the incipient research in neuroscience on this topic. This mini review reveals that, while there are plenty of studies on the influence of emoticons in communication from a social psychology perspective, little is known about the neurocognitive basis of the effects of emoticons on communication dynamics.

  9. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  10. CORBA-based distributed software framework for the NIF integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Stout, E.A. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: stout6@llnl.gov; Carey, R.W.; Estes, C.M.; Fisher, J.M.; Lagin, L.J.; Mathisen, D.G.; Reynolds, C.A.; Sanchez, R.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)

    2008-04-15

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500 TW, ultra-violet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. The NIF is operated by the Integrated Computer Control System (ICCS) which is a scalable, framework-based control system distributed over 800 computers throughout the NIF. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Object-oriented software design patterns are implemented as templates and extended by application software. Developers extend the framework base classes to model the numerous physical control points and implement specializations of common application behaviors. An estimated 140,000 software objects, each individually addressable through CORBA, will be active at full scale. Many of these objects have persistent configuration information stored in a database. The configuration data is used to initialize the objects at system start-up. Centralized server programs that implement events, alerts, reservations, data archival, name service, data access, and process management provide common system wide services. At the highest level, a model-driven, distributed shot automation system provides a flexible and scalable framework for automatic sequencing of workflow for control and monitoring of NIF shots. The shot model, in conjunction with data defining the parameters and goals of an experiment, describes the steps to be performed by each subsystem in order to prepare for and fire a NIF shot. Status and usage of this distributed framework are described.
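
    As an aside for readers unfamiliar with the pattern this abstract describes, the sketch below shows in minimal Python the general shape of such a framework: application classes extend a common base, are initialized from persisted configuration, and become individually addressable through a name service. All class and attribute names are invented for illustration; this is not the ICCS code base or a CORBA API.

    ```python
    # Minimal sketch of the framework pattern described above. Hypothetical
    # names throughout; a real system would expose objects via CORBA rather
    # than a process-local registry.

    class NameService:
        """Toy stand-in for a distributed naming service."""
        _registry = {}

        @classmethod
        def register(cls, name, obj):
            cls._registry[name] = obj

        @classmethod
        def resolve(cls, name):
            return cls._registry[name]


    class FrameworkObject:
        """Base class providing registration and config-driven start-up."""

        def __init__(self, name, config):
            self.name = name
            self.configure(config)            # initialize from persisted data
            NameService.register(name, self)  # make the object addressable

        def configure(self, config):
            for key, value in config.items():
                setattr(self, key, value)


    class MirrorActuator(FrameworkObject):
        """Application-level specialization modeling one control point."""

        def move_to(self, position_mm):
            self.position_mm = position_mm


    # Objects come up from configuration records, then are looked up by name.
    MirrorActuator("beamline07/mirror/actuatorX", {"position_mm": 0.0})
    NameService.resolve("beamline07/mirror/actuatorX").move_to(1.25)
    ```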

  11. CORBA-Based Distributed Software Framework for the NIF Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Stout, E A; Carey, R W; Estes, C M; Fisher, J M; Lagin, L J; Mathisen, D G; Reynolds, C A; Sanchez, R J

    2007-11-20

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8 Megajoule, 500-Terawatt, ultra-violet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. The NIF is operated by the Integrated Computer Control System (ICCS) which is a scalable, framework-based control system distributed over 800 computers throughout the NIF. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Object-oriented software design patterns are implemented as templates and extended by application software. Developers extend the framework base classes to model the numerous physical control points and implement specializations of common application behaviors. An estimated 140 thousand software objects, each individually addressable through CORBA, will be active at full scale. Many of these objects have persistent configuration information stored in a database. The configuration data is used to initialize the objects at system start-up. Centralized server programs that implement events, alerts, reservations, data archival, name service, data access, and process management provide common system wide services. At the highest level, a model-driven, distributed shot automation system provides a flexible and scalable framework for automatic sequencing of work-flow for control and monitoring of NIF shots. The shot model, in conjunction with data defining the parameters and goals of an experiment, describes the steps to be performed by each subsystem in order to prepare for and fire a NIF shot. Status and usage of this distributed framework are described.

  12. Solid secondary waste testing for maintenance of the Hanford Integrated Disposal Facility Performance Assessment - FY 2017

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, Ralph L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Seitz, Roger R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Dixon, Kenneth L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-08-01

    The Waste Treatment and Immobilization Plant (WTP) at Hanford is being constructed to treat 56 million gallons of radioactive waste currently stored in underground tanks at the Hanford site. Operation of the WTP will generate several solid secondary waste (SSW) streams, including used process equipment, contaminated tools and instruments, decontamination wastes, high-efficiency particulate air (HEPA) filters, carbon adsorption beds, silver mordenite iodine sorbent beds, and spent ion exchange resins (IXr), all of which are to be disposed of in the Integrated Disposal Facility (IDF). An applied research and development program was developed using a phased approach to incrementally develop the information necessary to support the IDF performance assessment (PA), with each phase of the testing building on results from the previous set of tests and considering new information from the IDF PA calculations. This report contains the results from the exploratory phase and Phase 1, and preliminary results from Phase 2. Phase 3 is expected to begin in the fourth quarter of FY17.

  13. AN INTEGRATED SERVICE EXCELLENCE MODEL FOR MILITARY TEST AND EVALUATION FACILITIES

    Directory of Open Access Journals (Sweden)

    Gerhard De Coning

    2011-08-01

    Full Text Available The purpose of this article is to introduce an Integrated Service Excellence Model (ISEM) for empowering the leadership core of the capital-intensive military test and evaluation facilities to provide strategic military test and evaluation services and to continuously improve service excellence by ensuring that all activities necessary to design, develop and implement a test and evaluation service are effective and efficient. In order to develop the ISEM, various management tools and productivity and quality models were identified and tested through an empirical study conducted amongst the various test and evaluation facilities’ leadership core. Solutions to financial, human resource and environmental challenges as well as quality standards were built into the ISEM. Governance principles and leadership perceptions and recommendations further contributed to the development of the ISEM.

  14. Recharge Data Package for the 2005 Integrated Disposal Facility Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fayer, Michael J.; Szecsody, Jim E.

    2004-06-30

    Pacific Northwest National Laboratory assisted CH2M Hill Hanford Group, Inc., (CHG) by providing estimates of recharge rates for current conditions and long-term scenarios involving disposal in the Integrated Disposal Facility (IDF). The IDF will be located in the 200 East Area at the Hanford Site and will receive several types of waste including immobilized low-activity waste. The recharge estimates for each scenario were derived from lysimeter and tracer data collected by the IDF PA Project and from modeling studies conducted for the project. Recharge estimates were provided for three specific site features (the surface barrier; possible barrier side slopes; and the surrounding soil) and four specific time periods (pre-Hanford; Hanford operations; surface barrier design life; post-barrier design life). CHG plans to conduct a performance assessment of the latest IDF design and call it the IDF 2005 PA; this recharge data package supports the upcoming IDF 2005 PA.

  15. Progress on the Combustion Integrated Rack Component of the Fluids and Combustion Facility

    Science.gov (United States)

    Weiland, Karen J.; Urban, Dave (Technical Monitor)

    1999-01-01

    The Fluids and Combustion Facility (FCF) is a facility-class payload planned for the International Space Station. It is designed to accommodate a wide variety of investigations encompassing most of the range of microgravity fluid physics and combustion science. The Combustion Integrated Rack component of the FCF is currently scheduled to be launched in 2003 and will operate independently until additional racks of the FCF are launched. The FCF is intended to complete between five and fifteen combustion experiments per year over its planned ten-year lifetime. Combustion areas that may be studied include laminar flames, reaction kinetics, droplet and spray combustion, flame spread, fire and fire suppressants, condensed phase organic fuel combustion, turbulent combustion, soot and polycyclic aromatic hydrocarbons, and flame-synthesized materials. Three different chamber inserts, one each for investigations of droplet, solid fuel, and gaseous fuel combustion, that can accommodate multiple experiments will be used initially so as to maximize the reuse of hardware. The current flight and flight-definition investigations are briefly described.

  16. Development of a Remote Handling System in an Integrated Pyroprocessing Facility

    Directory of Open Access Journals (Sweden)

    Hyo Jik Lee

    2013-10-01

    Full Text Available Over the course of a decade-long research programme, the Korea Atomic Energy Research Institute (KAERI) has developed several remote handling systems for use in pyroprocessing research facilities. These systems are now used successfully for the operation and maintenance of processing equipment. The most recent remote handling system is the bridge-transported dual arm servo-manipulator system (BDSM), which is used for remote operation at the world’s largest pyroprocess integrated inactive demonstration facility (PRIDE). Accurate and reliable servo-control is the basic requirement for the BDSM to accomplish any given tasks successfully in a hot-cell environment. To achieve this end, the hardware and software of a digital signal processor-based remote control system were fully custom-developed and implemented to control the BDSM. To reduce the residual vibration of the BDSM, several input profiles, including input shaping, were carefully chosen and evaluated. Furthermore, a time delay controller was employed to achieve good tracking performance and systematic gain tuning. The experimental results demonstrate that the applied control algorithms are more effective than conventional approaches. The BDSM successfully completed its performance tests at a mock-up and was installed at PRIDE for real-world operation. The remote handling system at KAERI is expected to advance the actualization of pyroprocessing.
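
    The abstract names input shaping among the techniques evaluated for reducing residual vibration; as background, the Python sketch below implements the classical zero-vibration (ZV) input shaper, one standard member of that family. The mode frequency, damping ratio, and command profile are illustrative assumptions, not BDSM parameters.

    ```python
    import math

    def zv_shaper(omega_n, zeta):
        """(time, amplitude) impulses of a zero-vibration input shaper for a
        mode with natural frequency omega_n [rad/s] and damping ratio zeta."""
        omega_d = omega_n * math.sqrt(1.0 - zeta ** 2)  # damped frequency
        K = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta ** 2))
        return [(0.0, 1.0 / (1.0 + K)), (math.pi / omega_d, K / (1.0 + K))]

    def shape(command, dt, impulses):
        """Convolve a sampled command profile with the shaper impulses."""
        shaped = [0.0] * len(command)
        for t_imp, amp in impulses:
            shift = int(round(t_imp / dt))
            for i in range(shift, len(command)):
                shaped[i] += amp * command[i - shift]
        return shaped

    # Illustrative use: shape a unit step for a 2 Hz, 2%-damped manipulator mode.
    impulses = zv_shaper(omega_n=2.0 * math.pi * 2.0, zeta=0.02)
    shaped_step = shape([1.0] * 500, dt=0.002, impulses=impulses)
    ```

    Convolving any commanded trajectory with these two impulses cancels (ideally) the residual vibration of that mode, at the cost of a delay of half the damped period.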

  17. BIM for existing facilities: feasibility of spectral image integration to 3D point cloud data

    Directory of Open Access Journals (Sweden)

    Amano Kinjiro

    2016-01-01

    Full Text Available Accurate geometrical and spatial information about the built environment can be acquired, and the resulting 3D point cloud data must be processed to construct the digital model, Building Information Modelling (BIM), for existing facilities. Point clouds obtained by laser scanning of buildings and facilities are commonly used, but the data require external information so that objects and materials can be correctly identified and classified. A number of advanced data processing methods have been developed, such as the use of colour information to attach semantic information. However, the accuracy of colour information depends largely on the scene environment in which the image is acquired. The limited number of spectral channels on a conventional RGB camera often fails to extract important information about surface material, even though spectral surface reflectance can represent a signature of the material. Hyperspectral imaging can, instead, provide a precise representation of spatial and spectral information. By implementing such information in 3D point clouds, the efficiency of material detection and classification in BIM should be significantly improved. In this work, we examine the feasibility of this image integration and discuss practical difficulties in its development.

  18. Development of a Remote Handling System in an Integrated Pyroprocessing Facility

    Directory of Open Access Journals (Sweden)

    Hyo Jik Lee

    2013-10-01

    Full Text Available Over the course of a decade-long research programme, the Korea Atomic Energy Research Institute (KAERI) has developed several remote handling systems for use in pyroprocessing research facilities. These systems are now used successfully for the operation and maintenance of processing equipment. The most recent remote handling system is the bridge-transported dual arm servo-manipulator system (BDSM), which is used for remote operation at the world's largest pyroprocess integrated inactive demonstration facility (PRIDE). Accurate and reliable servo-control is the basic requirement for the BDSM to accomplish any given tasks successfully in a hot-cell environment. To achieve this end, the hardware and software of a digital signal processor-based remote control system were fully custom-developed and implemented to control the BDSM. To reduce the residual vibration of the BDSM, several input profiles, including input shaping, were carefully chosen and evaluated. Furthermore, a time delay controller was employed to achieve good tracking performance and systematic gain tuning. The experimental results demonstrate that the applied control algorithms are more effective than conventional approaches. The BDSM successfully completed its performance tests at a mock-up and was installed at PRIDE for real-world operation. The remote handling system at KAERI is expected to advance the actualization of pyroprocessing.

  19. Design and Integrate Improved Systems for Nuclear Facility Ventilation and Exhaust Operations

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Murray E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-04-15

    Objective: This R&D project would complete the development of three new systems and integrate them into a single experimental effort; each of the three systems also has stand-alone applicability across the DOE complex. At US DOE nuclear facilities, indoor air is filtered and ventilated for human occupancy, and exhaust air to the outdoor environment must be regulated and monitored. At least three technical standards address these functions, and the Los Alamos National Laboratory would complete an experimental facility to answer at least three questions: (1) Can the drag coefficient of a new Los Alamos air mixer be reduced for better operation in nuclear facility exhaust stacks? (2) Is it possible to verify the accuracy of a new dilution method for HEPA filter test facilities? (3) Is there a performance-based air flow metric (volumetric flow or mass flow) for operating HEPA filters? In summary, the three new systems are a mixer, a diluter, and a performance-based metric, respectively. The results of this project would be applicable to at least four technical standards: ANSI N13.1, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities; ASTM F1471, Standard Test Method for Air Cleaning Performance of a High-Efficiency Particulate Air Filter System; ASME N511, In-Service Testing of Nuclear Air Treatment, Heating, Ventilating, and Air-Conditioning Systems; and ASME AG-1, Code on Nuclear Air and Gas Treatment. All three proposed new systems must be combined into a single experimental device (i.e., a new function of the Los Alamos aerosol wind tunnel must be developed). Technical Approach: The Radiation Protection RP-SVS group at Los Alamos has an aerosol wind tunnel that was originally designed (2006) to evaluate small air samplers (cf. US EPA 40 CFR 53.42). In 2009, the tunnel was modified for exhaust stack verifications per the ANSI N13.1 standard. In 2010, modifications were started on the
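
    Question (3) above turns on the fact that volumetric-flow and mass-flow metrics are not interchangeable; the short sketch below makes the point with standard ideal-gas arithmetic (all values illustrative, not taken from the proposal).

    ```python
    # The same volumetric flow carries different mass depending on stack
    # temperature (and pressure), which is why a performance-based metric
    # must pick one basis. Illustrative numbers only.
    R = 8.314        # gas constant [J/(mol K)]
    M_AIR = 0.02897  # molar mass of air [kg/mol]

    def air_density(p_pa, t_k):
        """Ideal-gas density of air."""
        return p_pa * M_AIR / (R * t_k)

    q_v = 5.0  # volumetric flow through the filter [m^3/s]
    for t_c in (20.0, 60.0):
        rho = air_density(101_325.0, t_c + 273.15)
        print(f"{t_c:.0f} C: mass flow = {q_v * rho:.2f} kg/s")  # 6.02 vs 5.30
    ```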

  20. Using a qualitative approach for understanding hospital-affiliated integrated clinical and fitness facilities: characteristics and members' experiences.

    Science.gov (United States)

    Yang, Jingzhen; Kingsbury, Diana; Nichols, Matthew; Grimm, Kristin; Ding, Kele; Hallam, Jeffrey

    2015-06-19

    With health care shifting away from the traditional sick care model, many hospitals are integrating fitness facilities and programs into their clinical services in order to support health promotion and disease prevention at the community level. Through a series of focus groups, the present study assessed characteristics of hospital-affiliated integrated facilities located in Northeast Ohio, United States, and members' experiences with respect to these facilities. Adult members were invited to participate in a focus group using a recruitment flyer. A total of 6 focus groups were conducted in 2013, each lasting one hour and including 5 to 12 participants. The responses and discussions were recorded and transcribed verbatim, then analyzed independently by research team members. Major themes were identified after consensus was reached. The participants' average age was 57, with 56.8% currently under a doctor's care. Four major themes associated with integrated facilities and members' experiences emerged across the six focus groups: 1) facility/program, 2) social atmosphere, 3) provider, and 4) member. Within each theme, several sub-themes were also identified. A key feature of integrated facilities is the availability of clinical and fitness services "under one roof". Many participants remarked that they initially attended physical therapy and became members of the fitness facility afterwards, or vice versa. The participants had favorable views of and experiences with the superior physical environment and atmosphere, personal attention, tailored programs, and knowledgeable, friendly, and attentive staff. In particular, participants favored the emphasis on preventive care and the promotion of holistic health and wellness. These results support the integration of wellness promotion and programming with traditional medical care and call for further evaluation of such a model with regard to participants' health outcomes.

  1. Computer-integrated quality management system for power stations. Computer-integriertes Qualitaetsmanagementsystem fuer Kraftwerke

    Energy Technology Data Exchange (ETDEWEB)

    Durst, K.H.; Scheurer, K.; Meinhardt, H. (Siemens AG, Offenbach (Germany). Abt. Qualitaetssicherung)

    1993-03-01

    Conventional CAQ systems, which were developed for monitoring mass production, are not very suitable for quality assurance in the construction and operation of plants and power stations. Plant and power station construction involves long-life products, manufactured in small batches or individually, that must be monitored over their service life. So that the quality of these products can be monitored, and assured economically and reliably by preventive maintenance measures, it is necessary to combine the plant documentation, repeated tests, and repair or replacement measures in a 'computer-integrated quality management system'. For large, complex plants such as power stations, an operation guidance system was developed which includes all important plant information and makes it available in a user-friendly way to the company's management. The article introduces this system. (orig.).

  2. Thermal hydraulic similarity analysis of the integral effect test facility for main steam line break events

    Energy Technology Data Exchange (ETDEWEB)

    Choi, K.Y.; Park, H.S.; Euh, D.J.; Kwon, T.S.; Baek, W.P. [Thermal Hydraulic Safety Research Division Korea Atomic Energy Research Institute 150 Dukjin-Dong, Yusong-Gu, Daejeon 305-353 (Korea, Republic of)

    2005-07-01

    Full text of publication follows: A thermal-hydraulic integral effect test facility, ATLAS (Advanced Thermal-hydraulic Test Loop for Accident Simulation), is being constructed at the Korea Atomic Energy Research Institute (KAERI). The ATLAS is a 1/2 reduced-height and 1/288 volume-scaled test facility based on the design features of the APR1400, an evolutionary pressurized water reactor developed by Korean industry. The ATLAS will be used to gain a more realistic understanding of the thermal-hydraulic phenomena following postulated events and to carry out performance evaluation and safety analysis of the reference plants. The MSLB (Main Steam Line Break) event is one of the representative non-LOCA events, and the thermal-hydraulic phenomena following this event are to be investigated in the ATLAS. In this paper, thermal-hydraulic similarity for MSLB events between the ATLAS and the prototype plant, the APR1400, is assessed by using the MARS code, a multi-dimensional best-estimate thermal-hydraulic code being developed by KAERI. Several cases, including SLBFPLOOP and SLBFP, are taken into account for the similarity analysis in this paper. Neutronic effects such as moderator temperature coefficients and Doppler reactivity in the APR1400 are not considered in this study. The same control logics for the major sequence of events, such as reactor trip, turbine trip, valve opening, and actuation of the emergency cooling system, are applied to the ATLAS and the APR1400. The present investigation is focused on the scaling and the reduced-power effects on thermal-hydraulic similarity after initiation of MSLB events. It is found that the ATLAS facility shows similar thermal-hydraulic responses to the MSLB events. However, the initial high secondary pressure before the MSLB initiation resulted in a primary pressure and temperature progression different from that of the APR1400. The break flow from the main steam line is found to be one of the most dominating parameters governing the transient
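
    To make the quoted scaling concrete: the stated 1/2 height and 1/288 volume ratios fix the flow-area ratio, and under the usual reduced-height (Ishii-type) scaling the time and velocity ratios follow from the square root of the height ratio. The arithmetic below applies those standard relations; the scaling law itself is a textbook assumption, not a statement from this abstract.

    ```python
    # Geometric and kinematic consequences of the stated ATLAS scaling.
    height_ratio = 1 / 2
    volume_ratio = 1 / 288

    area_ratio = volume_ratio / height_ratio   # flow-area scale: 1/144
    time_ratio = height_ratio ** 0.5           # ~0.707: transients run ~1.41x faster
    velocity_ratio = height_ratio ** 0.5       # fluid velocity scale

    print(f"area 1/{1 / area_ratio:.0f}, time ratio {time_ratio:.3f}")
    ```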

  3. Self-Adaptive Filon's Integration Method and Its Application to Computing Synthetic Seismograms

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hai-Ming; CHEN Xiao-Fei

    2001-01-01

    Based on the principle of the self-adaptive Simpson integration method, and by incorporating the 'fifth-order' Filon's integration algorithm [Bull. Seism. Soc. Am. 73 (1983) 913], we have proposed a simple and efficient numerical integration method, i.e., the self-adaptive Filon's integration method (SAFIM), for computing synthetic seismograms at large epicentral distances. With numerical examples, we have demonstrated that the SAFIM is not only accurate but also very efficient. This new integration method is expected to be very useful in seismology, as well as in computing similar oscillatory integrals in other branches of physics.
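
    The abstract names the method's two ingredients (adaptive bisection in the style of adaptive Simpson, and a Filon rule for the oscillatory factor) but not the implementation. The sketch below is a generic self-adaptive Filon integrator for integrals of the form ∫ f(x)cos(ωx)dx assembled from those standard ingredients; it illustrates the idea and is not the authors' SAFIM code. The classical Filon weights, including the small-argument series used to avoid cancellation, follow Abramowitz and Stegun, Section 25.4.

    ```python
    import math

    def filon_weights(theta):
        """Classical Filon weights alpha, beta, gamma; a series expansion is
        used for small theta to avoid catastrophic cancellation."""
        if abs(theta) < 0.05:
            t2 = theta * theta
            alpha = theta * t2 * (2.0 / 45.0 - 2.0 * t2 / 315.0)
            beta = 2.0 / 3.0 + 2.0 * t2 / 15.0 - 4.0 * t2 * t2 / 105.0
            gamma = 4.0 / 3.0 - 2.0 * t2 / 15.0 + t2 * t2 / 210.0
        else:
            s, c, t3 = math.sin(theta), math.cos(theta), theta ** 3
            alpha = (theta * theta + theta * s * c - 2.0 * s * s) / t3
            beta = 2.0 * (theta * (1.0 + c * c) - 2.0 * s * c) / t3
            gamma = 4.0 * (s - theta * c) / t3
        return alpha, beta, gamma

    def filon_cos(f, a, b, omega):
        """Three-point Filon rule for the integral of f(x)*cos(omega*x) on [a, b]."""
        h, m = 0.5 * (b - a), 0.5 * (a + b)
        alpha, beta, gamma = filon_weights(omega * h)
        even = 0.5 * (f(a) * math.cos(omega * a) + f(b) * math.cos(omega * b))
        odd = f(m) * math.cos(omega * m)
        ends = f(b) * math.sin(omega * b) - f(a) * math.sin(omega * a)
        return h * (alpha * ends + beta * even + gamma * odd)

    def adaptive_filon_cos(f, a, b, omega, tol=1e-10, depth=40):
        """Self-adaptive Filon integration: bisect until the whole-interval
        and two-half estimates agree, as in adaptive Simpson integration."""
        whole = filon_cos(f, a, b, omega)
        m = 0.5 * (a + b)
        halves = filon_cos(f, a, m, omega) + filon_cos(f, m, b, omega)
        if abs(halves - whole) < tol or depth == 0:
            return halves
        return (adaptive_filon_cos(f, a, m, omega, tol / 2, depth - 1)
                + adaptive_filon_cos(f, m, b, omega, tol / 2, depth - 1))

    # Check against the closed form:
    # integral_0^1 x cos(50x) dx = (cos 50 + 50 sin 50 - 1) / 2500.
    approx = adaptive_filon_cos(lambda x: x, 0.0, 1.0, 50.0)
    exact = (math.cos(50.0) + 50.0 * math.sin(50.0) - 1.0) / 2500.0
    assert abs(approx - exact) < 1e-8
    ```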

  4. Conceptual framework for the integration of computer-aided design and computer-aided process planning

    Energy Technology Data Exchange (ETDEWEB)

    Li, R.K.

    1986-01-01

    This research presents a conceptual framework for the integration of Computer-Aided Design (CAD) and Computer-Aided Process Planning (CAPP). The conceptual framework resides in an environment of N CAD systems and M CAPP systems. It consists of three major modules: a generic-part definition data structure, a preprocessor, and a postprocessor. The generic-part definition data structure was developed to serve as a neutral part-definition data representation between CAD and CAPP systems. With this structure, the number of interfacing systems can be reduced to 1 + M systems. The preprocessor, a part-feature recognition system, is designed to extract part definition data from an IGES file, evaluate the data, allow inclusion of unsupported data, and finally put the data into the data structure. The postprocessor converts the data from the data structure to the part input format of a selected CAPP system. A prototype system that uses IBM's CAD package (CADAM), IGES, and United Technologies Research Center's CAPP package (CMPP) was developed to test and prove the concept of this research. The input is a CADAM graphic design file, and the outputs are a summary of operations and a tolerance control chart, which are ready to be used in the production shops.
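
    To visualize what a neutral part-definition data structure buys, the toy Python sketch below mirrors the three modules: a generic structure, a preprocessor stub standing in for IGES feature recognition, and a postprocessor emitting one CAPP system's part input format. All field names and formats are invented for illustration; the original work of course predates this notation.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical neutral part-definition structure standing between N CAD
    # systems and M CAPP systems; field names are illustrative only.

    @dataclass
    class Feature:
        kind: str              # e.g. "hole", "slot", "face"
        dimensions: dict       # nominal dimensions, e.g. {"diameter": 10.0}
        tolerances: dict = field(default_factory=dict)

    @dataclass
    class GenericPart:
        part_id: str
        material: str
        features: list = field(default_factory=list)

    def preprocess_iges(iges_text):
        """Preprocessor stub: recognize features in IGES data and fill the
        neutral structure (real feature recognition is far more involved)."""
        part = GenericPart(part_id="demo-001", material="steel")
        part.features.append(Feature("hole", {"diameter": 10.0}, {"diameter": 0.05}))
        return part

    def postprocess_for_capp(part):
        """Postprocessor stub: emit one CAPP system's part input format."""
        lines = [f"PART {part.part_id} MATERIAL {part.material}"]
        for feat in part.features:
            lines.append(f"FEATURE {feat.kind} {feat.dimensions} TOL {feat.tolerances}")
        return "\n".join(lines)

    print(postprocess_for_capp(preprocess_iges("...IGES data...")))
    ```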

  5. 3D Vectorial Time Domain Computational Integrated Photonics

    Energy Technology Data Exchange (ETDEWEB)

    Kallman, J S; Bond, T C; Koning, J M; Stowell, M L

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D-Time-Domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While there exist a number of photonics simulation tools on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation induced details, Optical Logic edge emitting lasers with lateral optical inputs). In addition we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications). We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem for both the

  6. NOMINATION FOR THE PROJECT MANAGEMENT INSTITUTE (PMI) PROJECT OF THE YEAR AWARD INTEGRATED DISPOSAL FACILITY (IDF)

    Energy Technology Data Exchange (ETDEWEB)

    MCLELLAN, G.W.

    2007-02-07

    CH2M HILL Hanford Group, Inc. (CH2M HILL) is pleased to nominate the Integrated Disposal Facility (IDF) project for the Project Management Institute's consideration as 2007 Project of the Year. Built for the U.S. Department of Energy's (DOE) Office of River Protection (ORP) at the Hanford Site, the IDF is the site's first Resource Conservation and Recovery Act (RCRA)-compliant disposal facility. The IDF is important to DOE's waste management strategy for the site. Effective management of the IDF project contributed to the project's success. The project was carefully managed to meet three Tri-Party Agreement (TPA) milestones. The completed facility fully satisfied the needs and expectations of the client, regulators and stakeholders. Ultimately, the project, initially estimated to require 48 months and $33.9 million to build, was completed four months ahead of schedule and $11.1 million under budget. DOE directed construction of the IDF to provide additional capacity for disposing of low-level radioactive and mixed (i.e., radioactive and hazardous) solid waste. The facility needed to comply with federal and Washington State environmental laws and meet TPA milestones. The facility had to accommodate over one million cubic yards of the waste material, including immobilized low-activity waste packages from the Waste Treatment Plant (WTP), low-level and mixed low-level waste from WTP failed melters, and alternative immobilized low-activity waste forms, such as bulk-vitrified waste. CH2M HILL designed and constructed a disposal facility with a redundant system of containment barriers and a sophisticated leak-detection system. Built on a 168-acre site, the facility's construction met all regulatory requirements. The facility's containment system actually exceeds the state's environmental requirements for a hazardous waste landfill. Effective management of the IDF construction project required working through highly political and legal

  7. Geochemical Data Package for the 2005 Hanford Integrated Disposal Facility Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, Kenneth M.; Serne, R JEFFREY.; Kaplan, D I.

    2004-09-30

    CH2M HILL Hanford Group, Inc. (CH2M HILL) is designing and assessing the performance of an integrated disposal facility (IDF) to receive low-level waste (LLW), mixed low-level waste (MLLW), immobilized low-activity waste (ILAW), and failed or decommissioned melters. The CH2M HILL project to assess the performance of this disposal facility is the Hanford IDF Performance Assessment (PA) activity. The goal of the Hanford IDF PA activity is to provide a reasonable expectation that the disposal of the waste is protective of the general public, groundwater resources, air resources, surface-water resources, and inadvertent intruders. Achieving this goal will require prediction of contaminant migration from the facilities. This migration is expected to occur primarily via the movement of water through the facilities, and the consequent transport of dissolved contaminants in the vadose zone to groundwater, where contaminants may be re-introduced to receptors via drinking water wells or mixing in the Columbia River. Pacific Northwest National Laboratory (PNNL) assists CH2M HILL in their performance assessment activities. One of the PNNL tasks is to provide estimates of the geochemical properties of the materials comprising the IDF, the disturbed region around the facility, and the physically undisturbed sediments below the facility (including the vadose zone sediments and the aquifer sediments in the upper unconfined aquifer). The geochemical properties are expressed as parameters that quantify the adsorption of contaminants and the solubility constraints that might apply for contaminants whose concentrations may exceed solubility limits. The common parameters used to quantify adsorption and solubility are the distribution coefficient (Kd) and the thermodynamic solubility product (Ksp), respectively. In this data package, we approximate the solubility of contaminants using a more simplified construct, called the solution concentration limit, a constant value. The Kd values and
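
    For orientation on how such parameters enter a transport calculation: under the linear-sorption model, the Kd sets a retardation factor R = 1 + (rho_b/theta)Kd that divides the pore-water velocity. The worked example below uses round illustrative numbers, not values from this data package.

    ```python
    # Retardation factor under the linear-sorption (Kd) model:
    #     R = 1 + (rho_b / theta) * Kd
    # Parameter values below are illustrative only.
    rho_b = 1.6    # sediment bulk density [g/cm^3]
    theta = 0.30   # volumetric water content [-]
    kd = 0.6       # distribution coefficient [mL/g]

    R = 1.0 + (rho_b / theta) * kd           # dimensionless; here R = 4.2
    pore_velocity = 0.5                      # pore-water velocity [m/yr], illustrative
    contaminant_velocity = pore_velocity / R

    print(f"R = {R:.2f}; contaminant moves at {contaminant_velocity:.3f} m/yr")
    ```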

  8. LRS2: the new facility low resolution integral field spectrograph for the Hobby-Eberly Telescope

    CERN Document Server

    Chonis, Taylor S; Lee, Hanshin; Tuttle, Sarah E; Vattiat, Brian L

    2014-01-01

    The second generation Low Resolution Spectrograph (LRS2) is a new facility instrument for the Hobby-Eberly Telescope (HET). Based on the design of the Visible Integral-field Replicable Unit Spectrograph (VIRUS), which is the new flagship instrument for carrying out the HET Dark Energy Experiment (HETDEX), LRS2 provides integral field spectroscopy for a seeing-limited field of 12 x 6 arcseconds. For LRS2, the replicable design of VIRUS has been leveraged to gain broad wavelength coverage from 370 nm to 1 micron, spread between two fiber-fed dual-channel spectrographs, each of which can operate as an independent instrument. The blue spectrograph, LRS2-B, covers 370-470 nm and 460-700 nm at fixed resolving powers of ~1900 and ~1100, respectively, while the red spectrograph, LRS2-R, covers 650-842 nm and 818-1050 nm with both of its channels having a resolving power of ~1800. In this paper, we present a detailed description of the instrument's design in which we focus on the departures from the basic VIRUS framew...

  9. Integrated, long term, sustainable, cost effective biosolids management at a large Canadian wastewater treatment facility

    Energy Technology Data Exchange (ETDEWEB)

    LeBlance, R.J.; Allain, C.J.; Laughton, P.J.; Henry, J.G.

    2003-07-01

    The Greater Moncton Sewerage Commission's 115 000 m³/d advanced, chemically assisted primary wastewater treatment facility, located in New Brunswick, Canada, has developed an integrated, long-term, sustainable, cost-effective programme for the management and beneficial utilization of biosolids from lime-stabilized raw sludge. The paper overviews biosolids production, lime stabilization, conveyance, and odour control, followed by an in-depth discussion of the wastewater-sludge-as-a-resource programme, namely: composting, mine site reclamation, landfill cover, land application for agricultural use, tree farming, sod farm base as a soil enrichment, and topsoil manufacturing. The paper also addresses the issues of metals, pathogens, and organic compounds, and the quality control program, along with the regulatory requirements. Biosolids capital and operating costs are presented. Research results are presented on BIOSOL, a unique biological process developed at the University of Toronto, Canada, for removing metals from primary sludge and destroying pathogens. The paper also discusses an ongoing cooperative research project with the Université de Moncton in which various mixtures of plant biosolids are composted with low-quality soil. Integration, the approach to sustainability, and 'cumulative effects' as part of the overall biosolids management strategy are also discussed. (author)

  10. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    Full Text Available A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact and impact, are being developed to assist with the design of experiments and complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as identification of hazard zones and formulation of hazard plans.

  11. Assess and improve the sustainability of water treatment facility using Computational Fluid Dynamics

    Science.gov (United States)

    Zhang, Jie; Tejada-Martinez, Andres; Lei, Hongxia; Zhang, Qiong

    2016-11-01

    Fluid-dynamics problems in the water treatment industry are often simplified or omitted, since the focus is usually on the chemical process only. However, hydraulics also plays an important role in determining effluent water quality. Recent studies have demonstrated that computational fluid dynamics (CFD) has the ability to simulate the physical and chemical processes in reactive flows in water treatment facilities, such as in chlorine and ozone disinfection tanks. This study presents the results from CFD simulations of reactive flow in an existing full-scale ozone disinfection tank and in potential designs. Through analysis of the simulation results, we found that the baffling factor and CT10 are not optimal indicators of disinfection performance. We also found that the relationship between the effluent CT (the product of disinfectant concentration and contact time) obtained from CT transport simulation and the baffling factor depends on the location of ozone release. In addition, we analyzed the environmental and economic impacts of ozone disinfection tank designs and developed a composite indicator to quantify the sustainability of an ozone disinfection tank in the technological, environmental, and economic dimensions.
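
    The metrics the abstract critiques have standard operational definitions: T10 is the time for the first 10% of a tracer to pass, the baffling factor is T10 divided by the theoretical detention time V/Q, and CT10 is the disinfectant residual multiplied by T10. A minimal computation from a made-up tracer curve is sketched below; all numbers are illustrative.

    ```python
    # Standard disinfection metrics, computed from a step-dose tracer test.

    def t10_from_tracer(times, cum_fraction):
        """Time at which 10% of the tracer has passed (linear interpolation)."""
        for i in range(1, len(times)):
            if cum_fraction[i] >= 0.10:
                w = (0.10 - cum_fraction[i - 1]) / (cum_fraction[i] - cum_fraction[i - 1])
                return times[i - 1] + w * (times[i] - times[i - 1])
        raise ValueError("tracer curve never reaches 10%")

    times = [0, 2, 4, 6, 8, 10, 12]                      # minutes
    cum_frac = [0.0, 0.02, 0.07, 0.18, 0.45, 0.80, 1.0]  # fraction of tracer passed

    volume_m3, flow_m3_per_min = 500.0, 25.0
    tdt = volume_m3 / flow_m3_per_min        # theoretical detention time: 20 min
    t10 = t10_from_tracer(times, cum_frac)   # ~4.5 min
    baffling_factor = t10 / tdt              # ~0.23
    ct10 = 1.2 * t10                         # 1.2 mg/L residual -> CT10 in mg*min/L
    ```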

  12. The Invisible Barrier to Integrating Computer Technology in Education

    Science.gov (United States)

    Aflalo, Ester

    2014-01-01

    The article explores contradictions in teachers' perceptions regarding the place of computer technologies in education. The research population included 47 teachers who have incorporated computers in the classroom for several years. The teachers expressed positive attitudes regarding the decisive importance of computer technologies in furthering…

  13. Airborne gravimetry used in precise geoid computations by ring integration

    DEFF Research Database (Denmark)

    Kearsley, A.H.W.; Forsberg, René; Olesen, Arne Vestergaard

    1998-01-01

    Two detailed geoids have been computed in the region of North Jutland. The first computation used marine data in the offshore areas. For the second computation the marine data set was replaced by the sparser airborne gravity data resulting from the AG-MASCO campaign of September 1996. The results...

  14. Integral representations for computing real parabolic cylinder functions

    NARCIS (Netherlands)

    Gil, A.; Segura, J.; Temme, N.M.

    2005-01-01

    Integral representations are derived for the parabolic cylinder functions U(a,x), V(a,x) and W(a,x) and their derivatives. The new integrals will be used in numerical algorithms based on quadrature. They follow from contour integrals in the complex plane, by using methods from asymptotic analysis (s
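
    The abstract is cut off before the representations themselves; as a pointer to the kind of formula involved, one classical integral representation of U(a,x) (DLMF 12.5.1) reads:

    ```latex
    U(a,x) \;=\; \frac{e^{-x^{2}/4}}{\Gamma\!\left(\tfrac{1}{2}+a\right)}
    \int_{0}^{\infty} t^{\,a-\frac{1}{2}}\, e^{-\frac{1}{2}t^{2}-xt}\,\mathrm{d}t,
    \qquad \operatorname{Re} a > -\tfrac{1}{2},
    ```

    a form whose non-oscillatory, exponentially decaying integrand makes it well suited to the quadrature-based numerical algorithms the abstract mentions.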

  15. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  16. Airborne gravimetry used in precise geoid computations by ring integration

    DEFF Research Database (Denmark)

    Kearsley, A.H.W.; Forsberg, René; Olesen, Arne Vestergaard

    1998-01-01

    Two detailed geoids have been computed in the region of North Jutland. The first computation used marine data in the offshore areas. For the second computation the marine data set was replaced by the sparser airborne gravity data resulting from the AG-MASCO campaign of September 1996. The results...... of comparisons of the geoid heights at on-shore geometric control showed that the geoid heights computed from the airborne gravity data matched in precision those computed using the marine data, supporting the view that airborne techniques have enormous potential for mapping those unsurveyed areas between...
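
    For readers outside geodesy: ring integration is a way of discretizing the classical Stokes integral, which converts gravity anomalies into geoid heights (a standard formula, not specific to this paper):

    ```latex
    N \;=\; \frac{R}{4\pi\gamma} \iint_{\sigma} \Delta g \, S(\psi)\, \mathrm{d}\sigma,
    ```

    where N is the geoid height, R the mean Earth radius, γ normal gravity, Δg the gravity anomaly, ψ the spherical distance from the computation point, S(ψ) the Stokes kernel, and σ the unit sphere; the "rings" are bands of constant ψ over which Δg is averaged before summation.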

  17. A Model for Integrating Computation in Undergraduate Physics: An example from middle-division classical mechanics

    CERN Document Server

    Caballero, Marcos D

    2013-01-01

    Much of the research done by modern physicists would be impossible without the use of computation. And yet, while computation is a crucial tool of practicing physicists, physics curricula do not generally reflect its importance and utility. To more tightly connect undergraduate preparation with professional practice, we integrated computational instruction into middle-division classical mechanics at the University of Colorado Boulder. Our model for integration includes the construction of computational learning goals, the design of computational activities consistent with those goals, and the assessment of students' computational fluency. To assess students' computational fluency, we used open-ended computational projects in which students prepared reports describing a physical problem of their choosing. Many students chose projects from outside the domain of the course, and therefore, had to employ mathematical and computational techniques they had not yet been taught. After completing the project, most stud...

  18. Fuzzy model of the computer integrated decision support and management system in mineral processing

    OpenAIRE

    Miljanović Igor; Vujić Slobodan

    2008-01-01

    During research on fuzzy-logic-based computer-integrated systems for decision making and management support in mineral processing, carried out at the Department of Applied Computing and System Engineering of the Faculty of Mining and Geology, University of Belgrade, for the doctoral thesis of the first author and the wider demands of the mineral industry, the incompleteness of existing and contemporary fuzzy models of such computer-integrated systems was noticed. The pap...

  19. Teacher Perspectives on the Current State of Computer Technology Integration into the Public School Classroom

    Science.gov (United States)

    Zuniga, Ramiro

    2009-01-01

    Since the introduction of computers into the public school arena over forty years ago, educators have been convinced that the integration of computer technology into the public school classroom will transform education. Joining educators are state and federal governments. Public schools and others involved in the process of computer technology…

  20. X-ray facility for the ground calibration of the X-ray monitor JEM-X on board INTEGRAL

    DEFF Research Database (Denmark)

    Loffredo, G.; Pelliciari, C.; Frontera, F.;

    2003-01-01

    We describe the X-ray facility developed for the calibration of the X-ray monitor JEM-X on board the INTEGRAL satellite. The apparatus allowed the scanning of the detector geometric area with a pencil beam of desired energy over the major part of the passband of the instrument. The monochromatic...

  1. Development of an integrated methodology in support of remaining life assessment of Co-60 based gamma irradiation facilities

    NARCIS (Netherlands)

    Varde, P. V.; Joshi, N. S.; Agarwal, Mayank; Shrivastava, Amit; Bandi, L. N.; Kohli, A. K.

    2011-01-01

    An integrated approach has been developed for the remaining life assessment of an irradiation facility. The probabilistic safety assessment (PSA) was performed for the plant to develop the quantified indicators of safety for various situations. PSA results were used to identify and prioritize the sy

  2. Computation of form factors in massless QCD with finite master integrals

    Science.gov (United States)

    von Manteuffel, Andreas; Panzer, Erik; Schabinger, Robert M.

    2016-06-01

    We present the bare one-, two-, and three-loop form factors in massless quantum chromodynamics as linear combinations of finite master integrals. Using symbolic integration, we compute their ɛ expansions and thereby reproduce all known results with an independent method. Remarkably, in our finite basis, only integrals with a less-than-maximal number of propagators contribute to the cusp anomalous dimensions. We report on indications of this phenomenon at four loops, including the result for a finite, irreducible, twelve-propagator form factor integral. Together with this article, we provide our automated software setup for the computation of finite master integrals.

  3. On the Computation of Form Factors in Massless QCD with Finite Master Integrals

    CERN Document Server

    von Manteuffel, Andreas; Schabinger, Robert M

    2015-01-01

    We present the bare one-, two-, and three-loop form factors in massless Quantum Chromodynamics as linear combinations of finite master integrals. Using symbolic integration, we compute their $\epsilon$ expansions and thereby reproduce all known results with an independent method. Remarkably, in our finite basis, only integrals with a less-than-maximal number of propagators contribute to the cusp anomalous dimensions. We report on indications of this phenomenon at four loops, including the result for a finite, irreducible, twelve-propagator form factor integral. Together with this article, we provide our automated software setup for the computation of finite master integrals.

  4. Unstructured Computational Aerodynamics on Many Integrated Core Architecture

    KAUST Repository

    Al Farhan, Mohammed A.

    2016-06-08

    Shared memory parallelization of the flux kernel of PETSc-FUN3D, an unstructured tetrahedral mesh Euler flow code previously studied for distributed memory and multi-core shared memory, is evaluated on up to 61 cores per node and up to 4 threads per core. We explore several thread-level optimizations to improve flux kernel performance on the state-of-the-art many integrated core (MIC) Intel processor Xeon Phi “Knights Corner,” with a focus on strong thread scaling. While the linear algebraic kernel is bottlenecked by memory bandwidth for even modest numbers of cores sharing a common memory, the flux kernel, which arises in the control volume discretization of the conservation law residuals and in the formation of the preconditioner for the Jacobian by finite-differencing the conservation law residuals, is compute-intensive and is known to exploit effectively contemporary multi-core hardware. We extend study of the performance of the flux kernel to the Xeon Phi in three thread affinity modes, namely scatter, compact, and balanced, in both offload and native mode, with and without various code optimizations to improve alignment and reduce cache coherency penalties. Relative to baseline “out-of-the-box” optimized compilation, code restructuring optimizations provide about 3.8x speedup using the offload mode and about 5x speedup using the native mode. Even with these gains for the flux kernel, with respect to execution time the MIC simply achieves par with optimized compilation on a contemporary multi-core Intel CPU, the 16-core Sandy Bridge E5 2670. Nevertheless, the optimizations employed to reduce the data motion and cache coherency protocol penalties of the MIC are expected to be of value for CFD and many other unstructured applications as many-core architecture evolves. We explore large-scale distributed-shared memory performance on the Cray XC40 supercomputer, to demonstrate that optimizations employed on Phi hybridize to this context, where each of

  5. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurie, Carol

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  6. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Office of Energy Efficiency and Renewable Energy

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  7. Monitoring System for Storm Readiness and Recovery of Test Facilities: Integrated System Health Management (ISHM) Approach

    Science.gov (United States)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Franzl, Richard; Walker, Mark; Kapadia, Ravi; Venkatesh, Meera; Schmalzel, John

    2010-01-01

    Severe weather events are likely occurrences on the Mississippi Gulf Coast. It is important to rapidly diagnose and mitigate the effects of storms on Stennis Space Center's rocket engine test complex to avoid delays to critical test article programs, reduce costs, and maintain safety. An Integrated Systems Health Management (ISHM) approach and technologies are employed to integrate environmental (weather) monitoring, structural modeling, and the suite of available facility instrumentation to provide information for readiness before storms, rapid initial damage assessment to guide mitigation planning, on-going assurance as repairs are effected, and finally support for recertification. The system is designated the Katrina Storm Monitoring System (KStorMS). Integrated Systems Health Management (ISHM) describes a comprehensive set of capabilities that provide insight into the behavior and health of a system. Knowing the status of a system allows decision makers to effectively plan and execute their mission. For example, early insight into component degradation and impending failures provides more time to develop work-around strategies and more effectively plan for maintenance. Failures of system elements generally occur over time. Information extracted from sensor data, combined with system-wide knowledge bases and methods for information extraction and fusion, inference, and decision making, can be used to detect incipient failures. If failures do occur, it is critical to detect and isolate them, and suggest an appropriate course of action. ISHM enables determining the condition (health) of every element in a complex system-of-systems or SoS (detecting anomalies, diagnosing causes, predicting future anomalies), and providing data, information, and knowledge (DIaK) to control systems for safe and effective operation. ISHM capability is achieved by using a wide range of technologies that enable anomaly detection, diagnostics, prognostics, and advice for control: (1
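
    The abstract stays at the architecture level; as a toy rendering of its lowest layer (per-channel anomaly detection feeding higher-level health assessment), the Python sketch below flags departures from a rolling baseline. It is purely illustrative and unrelated to the actual KStorMS implementation.

    ```python
    from collections import deque
    import statistics

    class SensorMonitor:
        """Toy anomaly detector for one facility channel: flag readings that
        deviate from a rolling baseline by more than k standard deviations.
        A real ISHM layer fuses many such detectors with diagnostic logic."""

        def __init__(self, window=50, k=4.0):
            self.history = deque(maxlen=window)
            self.k = k

        def update(self, value):
            anomalous = False
            if len(self.history) >= 10:          # need some baseline first
                mean = statistics.fmean(self.history)
                std = statistics.pstdev(self.history)
                anomalous = std > 0 and abs(value - mean) > self.k * std
            self.history.append(value)
            return anomalous

    monitor = SensorMonitor()
    for reading in [10.1, 10.0, 9.9] * 10 + [14.7]:   # pressure, arbitrary units
        if monitor.update(reading):
            print("anomaly detected:", reading)
    ```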

  8. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  9. Integration of Computers into a Course on Biostatistics

    Science.gov (United States)

    Gjerde, Craig L.

    1977-01-01

    The biostatistics course for undergraduate medical and dental students at the University of Connecticut Health Center is taught by the Keller Plan, and students can use computers to analyze data sets and to score their unit tests. The computer is an essential tool for data analysis and an attractive option for test scoring. (LBH)

  10. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  11. Integration of Computers into a Course on Biostatistics

    Science.gov (United States)

    Gjerde, Craig L.

    1977-01-01

    The biostatistics course for undergraduate medical and dental students at the University of Connecticut Health Center is taught by the Keller Plan, and students can use computers to analyze data sets and to score their unit tests. The computer is an essential tool for data analysis and an attractive option for test scoring. (LBH)

  12. Integration of distributed computing into the drug discovery process.

    Science.gov (United States)

    von Korff, Modest; Rufener, Christian; Stritt, Manuel; Freyss, Joel; Bär, Roman; Sander, Thomas

    2011-02-01

    Grid computing offers an opportunity to gain massive computing power at low cost. We give a short introduction to the drug discovery process and exemplify the use of grid computing for image processing, docking, and 3D pharmacophore descriptor calculations. The principle of a grid and its architecture are briefly explained. More emphasis is laid on the issues related to a company-wide grid installation and embedding the grid into the research process. The future of grid computing in drug discovery is discussed in the expert opinion section. Most needed, besides reliable algorithms to predict compound properties, is seamless embedding of the grid into the discovery process. User-friendly access to powerful algorithms without restrictions, such as a limited number of licenses, has to be the goal of grid computing in drug discovery.

  13. Optimizing Computation of Repairs from Active Integrity Constraints

    DEFF Research Database (Denmark)

    Cruz-Filipe, Luís

    2014-01-01

    Active integrity constraints (AICs) are a form of integrity constraints for databases that not only identify inconsistencies, but also suggest how these can be overcome. The semantics for AICs defines different types of repairs, but deciding whether an inconsistent database can be repaired...
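
    Since the abstract is cut off, a small constructed example may help fix ideas: an active integrity constraint pairs a condition that detects an inconsistency with the update actions permitted to repair it, and repairs are minimal action sets that restore consistency. The Python sketch below enumerates such repairs by brute force; the relation names and the constraint are invented for illustration and are not drawn from the paper.

    ```python
    from itertools import combinations

    # Toy database of facts plus one active integrity constraint (invented):
    # "every employee must belong to some department; repair a violation either
    #  by deleting the emp fact or by adding the missing dept fact".
    db = {("emp", "ann"), ("emp", "bob"), ("dept", "ann")}

    def violations(facts):
        return [e for (r, e) in facts if r == "emp" and ("dept", e) not in facts]

    def allowed_actions(person):
        """Update actions the AIC offers for a violation on `person`."""
        return [("del", ("emp", person)), ("add", ("dept", person))]

    def apply_actions(facts, actions):
        out = set(facts)
        for op, fact in actions:
            if op == "del":
                out.discard(fact)
            else:
                out.add(fact)
        return out

    # A repair is a minimal set of allowed actions that removes all violations;
    # enumerate candidate sets by increasing size and stop at the first hits.
    candidates = [a for p in violations(db) for a in allowed_actions(p)]
    for size in range(len(candidates) + 1):
        repairs = [s for s in combinations(candidates, size)
                   if not violations(apply_actions(db, s))]
        if repairs:
            for r in repairs:
                print("repair:", r)   # delete emp(bob)  -or-  add dept(bob)
            break
    ```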

  14. Research on Data Integration Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Yanxia Wang

    2013-06-01

    Full Text Available In this study, we give some strategies for selecting the data source and several methods for data integration after analyzing the problems of sharing information resources among universities. According to the characteristics of various types of information resources in the university website, we propose the data integration framework model which combines virtual view method with data warehouse method.

  15. Integrative approach for wastewater treatment facilities with biomass transformation into energy

    Directory of Open Access Journals (Sweden)

    Anker Yaakov

    2017-01-01

    Full Text Available Current industrial environmental regulations favor processes with Integrated Pollution Prevention and Control (IPPC). While several systems are regarded by different international directives as IPPC Best Available Techniques or Technologies (BAT), none of these systems is capable of handling various pollutants in both gaseous and aquatic effluents. Additional hindrances to a complete BAT-IPPC procedure are hazardous or uneconomical byproducts of the IPPC processes and significant auxiliary costs for consumables and energy. The current research and subsequent projects are aimed at the development of a Biological Integrative Pollution Prevention and Control (Bio-IPPC) system. Such a system can be incorporated into various industrial processes in a way that leaves the byproduct without hazardous potential, so that it may be used as an economical raw material. The main initiative and heart of these systems is a micro-algae reactor, which is capable of treating various types of industrial pollutants in both the gaseous and aquatic phases. The algae are fed through thin-film circulation of the aquatic effluent, and the reactor atmosphere is enriched by flue gases. The excess algal biomass may be utilized for economic purposes, starting with animal feedstock, through organic fertilizer, and as industrial raw material for biofuel production or direct energy production. The first industrial project is a wastewater (WW) polishing stage for an industrial-zone WW treatment facility, which ensures high-level effluent purification and assimilation of the greenhouse gases released during the WW bioremediation process. The second industrial application aims to treat aquatic and gaseous effluents from coal-propelled power plants. The raw algal material from the two projects, although very different, is used for the development of a new efficient scheme for bioethanol production. In summary, the system presented is an actual Bio-IPPC that can interactively treat several industrial

  16. Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, Eric M.; McGrail, B. Peter; Rodriguez, Elsa A.; Schaef, Herbert T.; Saripalli, Prasad; Serne, R. Jeffrey; Krupka, Kenneth M.; Martin, P. F.; Baum, Steven R.; Geiszler, Keith N.; Reed, Lunde R.; Shaw, Wendy J.

    2004-09-01

    This data package documents the experimentally derived input data on the representative waste glasses: LAWA44, LAWB45, and LAWC22. These data will be used for Subsurface Transport Over Reactive Multi-phases (STORM) simulations of the Integrated Disposal Facility (IDF) for immobilized low-activity waste (ILAW). The STORM code will be used to provide the near-field radionuclide release source term for a performance assessment to be issued in July 2005. Documented in this data package are data related to (1) kinetic rate law parameters for glass dissolution, (2) the alkali (Na+)-hydrogen (H+) ion exchange rate, (3) the chemical reaction network of secondary phases that form in accelerated weathering tests, and (4) thermodynamic equilibrium constants assigned to these secondary phases. The kinetic rate law and Na+-H+ ion exchange rate were determined from single-pass flow-through experiments. Pressurized unsaturated flow (PUF) and product consistency (PCT) tests were used for accelerated weathering or aging of the glasses in order to determine the chemical reaction network of secondary phases that form. The majority of the thermodynamic data used in this data package were extracted from the thermodynamic database package shipped with the geochemical code EQ3/6, version 8.0. Because of the expected importance of 129I release from secondary waste streams being sent to the IDF from various thermal treatment processes, parameter estimates for diffusional release and solubility-controlled release from cementitious waste forms were estimated from the available literature.
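
    For orientation, the kinetic rate law referred to here is typically of the transition-state-theory form used for glass dissolution (a hedged reconstruction of the general form such data packages parameterize, not a quotation from the report):

      r = \bar{k} \; 10^{\eta\,\mathrm{pH}} \; e^{-E_a/RT} \left[ 1 - \left( \frac{Q}{K_g} \right)^{\sigma} \right]

    where r is the dissolution rate, \bar{k} the intrinsic rate constant, \eta the pH power-law coefficient, E_a the activation energy, Q the ion-activity product of the rate-controlling reaction, K_g its pseudo-equilibrium constant, and \sigma the Temkin coefficient; single-pass flow-through experiments of the kind mentioned above are what pin down \bar{k}, \eta, and E_a.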

  17. Velo and REXAN - Integrated Data Management and High Speed Analysis for Experimental Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Carson, James P.; Corrigan, Abigail L.; Einstein, Daniel R.; Guillen, Zoe C.; Heath, Brandi S.; Kuprat, Andrew P.; Lanekoff, Ingela T.; Lansing, Carina S.; Laskin, Julia; Li, Dongsheng; Liu, Yan; Marshall, Matthew J.; Miller, Erin A.; Orr, Galya; Pinheiro da Silva, Paulo; Ryu, Seun; Szymanski, Craig J.; Thomas, Mathew

    2013-01-10

    The Chemical Imaging Initiative at the Pacific Northwest National Laboratory (PNNL) is creating a ‘Rapid Experimental Analysis’ (REXAN) framework based on the concept of reusable component libraries. REXAN allows developers to quickly compose and customize high-throughput analysis pipelines for a range of experiments, as well as supporting the creation of multi-modal analysis pipelines. In addition, PNNL has coupled REXAN with its collaborative data management and analysis environment Velo to create easy-to-use data management and analysis environments for experimental facilities. This paper discusses the benefits of Velo and REXAN in the context of three examples: PNNL high-resolution mass spectrometry, reducing analysis times from hours to seconds and enabling the analysis of much larger data samples (100 KB to 40 GB) at the same time; ALS X-ray tomography, reducing analysis times of combined STXM and EM data collected at the ALS from weeks to minutes, decreasing manual work, and increasing the data volumes that can be analysed in a single step; and multi-modal nano-scale analysis of STXM and TEM data, providing a semi-automated process for particle detection. The creation of REXAN has significantly shortened the development time for these analysis pipelines. The integration of Velo and REXAN has significantly increased the scientific productivity of the instruments and their users by creating easy-to-use data management and analysis environments with greatly reduced analysis times and improved analysis capabilities.

  18. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, an MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of the verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and the sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computing and low data transmission. The scheme compensates for the limited communication and computing power of mobile devices, supports dynamic data operations in the mobile multicloud environment, and verifies data integrity without using the direct source file block. Experimental results also demonstrate that this scheme achieves a lower cost of computing and communications.
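
    The sMHT mentioned above builds on the ordinary Merkle hash tree, in which a single root hash commits to all file blocks. A minimal sketch in Python of the root computation (sequence enforcement and the BLS signature over the root are omitted; this is not the paper's code):

      # Illustrative Merkle-root computation over file blocks. A verifier who
      # holds a signed root can check one challenged block using only the
      # block and its sibling path, without the full file.
      import hashlib

      def sha256(data: bytes) -> bytes:
          return hashlib.sha256(data).digest()

      def merkle_root(blocks):
          """Return the Merkle root of a list of data blocks."""
          level = [sha256(b) for b in blocks]
          while len(level) > 1:
              if len(level) % 2:                # duplicate last node on odd levels
                  level.append(level[-1])
              level = [sha256(level[i] + level[i + 1])
                       for i in range(0, len(level), 2)]
          return level[0]

      print(merkle_root([b"block-0", b"block-1", b"block-2"]).hex())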

  19. Automation of integrated facilities. 3. conference on electrical engineering; Vernetzte Automation in der Gebaeudetechnik. Integration - Sicherheit - Wirtschaftlichkeit. 3. Fachtagung Elektrotechnik

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    Contents: Integration of technical systems in buildings using bus technology - the new guideline 6015; integrated automation and communication; room automation; integration of technical solutions demonstrated on the new building of the ''Investitionsbank des Landes Brandenburg''; and some examples of integration models and customer requirements. (GL)

  20. Integrating Numerical Computation into the Modeling Instruction Curriculum

    CERN Document Server

    Caballero, Marcos D; Aiken, John M; Douglas, Scott S; Scanlon, Erin M; Thoms, Brian; Schatz, Michael F

    2012-01-01

    We describe a way to introduce high school physics students with no background in programming to computational problem-solving experiences. Our approach builds on the great strides made by the Modeling Instruction reform curriculum. This approach emphasizes the practices of "Developing and using models" and "Computational thinking" highlighted by the NRC K-12 science standards framework. We taught 9th-grade students in a Modeling-Instruction-based physics course to construct computational models using the VPython programming environment. Numerical computation within the Modeling Instruction curriculum provides coherence among the curriculum's different force and motion models, links the various representations which the curriculum employs, and extends the curriculum to include real-world problems that are inaccessible to a purely analytic approach.
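
    To give a flavor of such student-built models (a generic sketch in plain Python rather than the course's actual VPython programs), the core of this kind of computational model is a small time-stepping loop:

      # Constant-force motion integrated with the Euler-Cromer update, the
      # kind of model students assemble step by step.
      dt = 0.01          # time step (s)
      m = 0.5            # mass (kg)
      F = 2.0            # constant net force (N)
      x, v, t = 0.0, 0.0, 0.0

      while t < 1.0:
          a = F / m              # Newton's second law: a = F_net / m
          v = v + a * dt         # update velocity first (Euler-Cromer) ...
          x = x + v * dt         # ... then position with the new velocity
          t = t + dt

      print(f"x = {x:.3f} m, v = {v:.3f} m/s")   # analytic result: x = 2 m, v = 4 m/s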

  1. Design of a Computer-Based Control System Using LabVIEW for the NEMESYS Electromagnetic Launcher Facility

    Science.gov (United States)

    2007-06-01

    A facility has been assembled to develop and test materials for the study of barrel lifetime in electromagnetic launchers (EML) for surface-fire support. A railgun shot typically occurs in less than 10 ms, and the firing of the capacitor banks that shape the current pulse occurs on timescales of 100s of..., so a control system able to respond quickly was necessary.

  2. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

    ... levels in CFD-based flowpath modeling of the facility. The analysis tools used here expand on the multi-element unstructured CFD which has been tailored and validated for the impingement dynamics of dry plumes, complex valve/feed systems, and high-pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation, such as (a) the importance of modeling the facility with a real-gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) the existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) the inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and the expansion of the second-stage steam ejectors. The procedure used for modeling the facility was as follows: (i) the engine, test cell, and first-stage ejectors were simulated with an axisymmetric approximation; (ii) the turning duct, second-stage ejectors, and the piping downstream of the second-stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation; the solution, i.e., primitive variables such as pressure, velocity components, temperature, and turbulence quantities, was passed from the first computational domain and specified as a supersonic boundary condition for the second simulation; (iii) the third domain comprised the exit diffuser and the region in the vicinity of the facility (primarily included to capture the correct shock structure at the exit of the facility and the entrainment characteristics). The first set of simulations, comprising the engine, test cell, and first-stage ejectors, was carried out both as a turbulent real-gas calculation and as a turbulent perfect-gas calculation. A comparison for the two cases (real-gas turbulent and perfect-gas turbulent) of the Ma

  3. Sensorimotor Integration in Speech Processing: Computational Basis and Neural Organization

    National Research Council Canada - National Science Library

    Hickok, Gregory; Houde, John; Rong, Feng

    2011-01-01

    .... We propose an integrative model of the speech-related "dorsal stream" in which sensorimotor interaction primarily supports speech production, in the form of a state feedback control architecture...

  4. Advanced computer algebra algorithms for the expansion of Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ablinger, Jakob; Round, Mark; Schneider, Carsten [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation; Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2012-10-15

    Two-point Feynman parameter integrals, with at most one mass and containing local operator insertions in 4+ε-dimensional Minkowski space, can be transformed to multi-integrals or multi-sums over hyperexponential and/or hypergeometric functions depending on a discrete parameter n. Given such a specific representation, we utilize an enhanced version of the multivariate Almkvist-Zeilberger algorithm (for multi-integrals) and a common summation framework of the holonomic and difference field approach (for multi-sums) to calculate recurrence relations in n. Finally, solving the recurrence we can decide efficiently if the first coefficients of the Laurent series expansion of a given Feynman integral can be expressed in terms of indefinite nested sums and products; if yes, the all-n solution is returned in compact representations, i.e., no algebraic relations exist among the occurring sums and products.

  5. Advanced Computer Algebra Algorithms for the Expansion of Feynman Integrals

    CERN Document Server

    Ablinger, J; Round, M; Schneider, C

    2012-01-01

    Two-point Feynman parameter integrals, with at most one mass and containing local operator insertions in $4+\epsilon$-dimensional Minkowski space, can be transformed to multi-integrals or multi-sums over hyperexponential and/or hypergeometric functions depending on a discrete parameter $n$. Given such a specific representation, we utilize an enhanced version of the multivariate Almkvist--Zeilberger algorithm (for multi-integrals) and a common summation framework of the holonomic and difference field approach (for multi-sums) to calculate recurrence relations in $n$. Finally, solving the recurrence we can decide efficiently if the first coefficients of the Laurent series expansion of a given Feynman integral can be expressed in terms of indefinite nested sums and products; if yes, the all-$n$ solution is returned in compact representations, i.e., no algebraic relations exist among the occurring sums and products.

  6. Integration of nanoscale memristor synapses in neuromorphic computing architectures

    Science.gov (United States)

    Indiveri, Giacomo; Linares-Barranco, Bernabé; Legenstein, Robert; Deligeorgis, George; Prodromakis, Themistoklis

    2013-09-01

    Conventional neuro-computing architectures and artificial neural networks have often been developed with no or loose connections to neuroscience. As a consequence, they have largely ignored key features of biological neural processing systems, such as their extremely low power consumption or their ability to carry out robust and efficient computation using massively parallel arrays of limited-precision, highly variable, and unreliable components. Recent developments in nano-technologies are making available extremely compact and low-power, but also variable and unreliable, solid-state devices that can potentially extend the offerings of existing CMOS technologies. In particular, memristors are regarded as a promising solution for modeling key features of biological synapses due to their nanoscale dimensions, their capacity to store multiple bits of information per element, and the low energy required to write distinct states. In this paper, we first review the neuro- and neuromorphic computing approaches that can best exploit the properties of memristive nanoscale devices, and then propose a novel hybrid memristor-CMOS neuromorphic circuit which represents a radical departure from conventional neuro-computing approaches, as it uses memristors to directly emulate the biophysics and temporal dynamics of real synapses. We point out the differences between the use of memristors in conventional neuro-computing architectures and the hybrid memristor-CMOS circuit proposed, and argue how this circuit represents an ideal building block for implementing brain-inspired probabilistic computing paradigms that are robust to variability and fault tolerant by design.

  7. Analysing the scalability of thermal hydraulic test facility data to reactor scale with a computer code; Vertailuanalyysin kaeyttoe termohydraulisten koelaitteistojen tulosten laitosmittakaavaan skaalautumisen tutkimisessa

    Energy Technology Data Exchange (ETDEWEB)

    Suikkanen, P.

    2009-01-15

    The objective of this Master's thesis was to study guidelines and procedures for the scaling of thermal hydraulic test facilities and to compare results from two test facility models and from an EPR model. The aim was to get an impression of how well the studied test facilities describe the behaviour at power plant scale during accident scenarios simulated with computer codes. The models were used to determine the influence of primary circuit mass inventory on the behaviour of the circuit. The data from the test facility models represent the same phenomena as the data from the EPR model. The results calculated with the PKL model were also compared against PKL test facility data and showed good agreement. Test facility data are used to validate computer codes, which are used in nuclear safety analysis. The scale of the facility has an effect on the behaviour of the phenomena, and therefore special care must be taken in using the data. (orig.)

  8. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    ... relevant European Research Infrastructures in the fields of Earth Science (EPOS and ICOS), Bioinformatics (BBMRI and ELIXIR), and Space Physics (EISCAT-3D). The first outcome of this activity has been the definition of a generic use case that captures the typical user scenario with respect to the integrated use of the EGI and EUDAT infrastructures. This generic use case allows a user to instantiate a set of Virtual Machine images on the EGI Federated Cloud to perform computational jobs that analyse data previously stored on EUDAT long-term storage systems. The results of such analysis can be staged back to EUDAT storage and, if needed, allocated Permanent Identifiers (PIDs) for future use. The implementation of this generic use case requires the following integration activities between EGI and EUDAT: (1) harmonisation of the user authentication and authorisation models, and (2) implementation of interface connectors between the relevant EGI and EUDAT services, particularly EGI Cloud compute facilities and the EUDAT long-term storage and PID systems. In the presentation, the collected user requirements and the implementation status of the generic use case will be shown. Furthermore, how the generic use case is currently applied to satisfy EPOS and ICOS needs will be described.

  9. Integrating computers in physics teaching: An Indian perspective

    Science.gov (United States)

    Jolly, Pratibha

    1997-03-01

    The University of Delhi has around twenty affiliated undergraduate colleges that offer a three-year physics major program to nearly five hundred students. All follow a common curriculum and submit to a centralized examination. This structure of tertiary education makes it relatively difficult to implement radical or rapid changes in the formal curriculum. The technology onslaught has, at last, irrevocably altered this; computers are carving new windows in old citadels and defining the agenda in teaching-learning environments the world over. In 1992, we formally introduced Computational Physics as a core paper in the second year of the Bachelor's program. As yet, the emphasis is on imparting familiarity with computers, a programming language and rudiments of numerical algorithms. In a parallel development, we also introduced a strong component of instrumentation with modern day electronic devices, including microprocessors. Many of us, however, would like to see not just computer presence in our curriculum but a totally new curriculum and teaching strategy that exploits, befittingly, the new technology. The current challenge is to realize in practice the full potential of the computer as the proverbial versatile tool: interfacing laboratory experiments for real-time acquisition and control of data; enabling rigorous analysis and data modeling; simulating micro-worlds and real life phenomena; establishing new cognitive linkages between theory and empirical observation; and between abstract constructs and visual representations.

  10. Application of Framework for Integrating Safety, Security and Safeguards (3Ss) into the Design Of Used Nuclear Fuel Storage Facility

    Energy Technology Data Exchange (ETDEWEB)

    Badwan, Faris M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Demuth, Scott F [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-06

    The Department of Energy’s Office of Nuclear Energy, Fuel Cycle Research and Development, develops options to the current commercial fuel cycle management strategy to enable the safe, secure, economic, and sustainable expansion of nuclear energy while minimizing proliferation risks, by conducting research and development focused on used nuclear fuel recycling and waste management to meet U.S. needs. Used nuclear fuel is currently stored onsite in either wet pools or dry storage systems, with disposal envisioned in an interim storage facility and, ultimately, in a deep-mined geologic repository. The safe management and disposition of used nuclear fuel and/or nuclear waste is a fundamental aspect of any nuclear fuel cycle. Integrating safety, security, and safeguards (3Ss) fully in the early stages of the design process for a new nuclear facility has the potential to effectively minimize safety, proliferation, and security risks. The 3Ss integration framework could become the new national and international norm and the standard process for designing future nuclear facilities. The purpose of this report is to develop a framework for integrating the safety, security, and safeguards concept into the design of a Used Nuclear Fuel Storage Facility (UNFSF). The primary focus is on the integration of safeguards and security into the UNFSF based on the existing Nuclear Regulatory Commission (NRC) approach to addressing the safety/security interface (10 CFR 73.58 and Regulatory Guide 5.73) for nuclear power plants. The methodology used for adaptation of the NRC safety/security interface will be used as the basis for development of the safeguards/security interface, and later as the basis for development of the safety/safeguards interface, completing the integration cycle of safety, security, and safeguards. The overall methodology for integration of the 3Ss will be proposed, but only the integration of safeguards and security will be applied to the design of the

  11. The engineering design integration (EDIN) system. [digital computer program complex

    Science.gov (United States)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  12. An integrative computational modelling of music structure apprehension

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2014-01-01

    An objectivization of music analysis requires a detailed formalization of the underlying principles and methods. The formalization of the most elementary structural processes is hindered by the complexity of music, both in terms of profusions of entities (such as notes) and of tight interactions...... between a large number of dimensions. Computational modeling would enable systematic and exhaustive tests on sizeable pieces of music, yet current researches cover particular musical dimensions with limited success. The aim of this research is to conceive a computational modeling of music analysis...... encompassing the main core components and focusing on their interdependencies. The system should be as simple as possible, while producing relevant structural analyses on a large variety of music. This paper describes the general principles of a computational framework for music analysis currently under...

  13. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    In this work, the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center are shown. SLURM (Simple Linux Utility for Resource Management) is an open-source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node, and interactive jobs, can be managed. Tests on the use of ACLs on queues, and in general on other resources, are then described. A peculiar SLURM feature we also verified is triggers on events, useful for configuring specific actions for each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post

  14. Integrating Computer-Mediated Communication into an EAP course

    Institute of Scientific and Technical Information of China (English)

    Li Xiao; Cao Ru-hua

    2006-01-01

    The development of the computer, along with the widespread use of the Internet, has rapidly promoted Computer-Mediated Communication (CMC) as a very important communication medium, which can be used widely and effectively in foreign language teaching and learning. This essay tries to explore the advantages of CMC as well as its proposed application, beginning with the introduction of some concepts related to CMC. From the research history, the rationale for using CMC in foreign language learning is summarised. The context of an EAP course is introduced and some suggestions on using CMC in this course are proposed.

  15. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  16. Computing periodic orbits using the anti-integrable limit

    CERN Document Server

    Sterling, D G

    1998-01-01

    Chaotic dynamics can be effectively studied by continuation from an anti-integrable limit. Using the Henon map as an example, we obtain a simple analytical bound on the domain of existence of the horseshoe that is equivalent to the well-known bound of Devaney and Nitecki. We also reformulate the popular method for finding periodic orbits introduced by Biham and Wenzel. Near an anti-integrable limit, we show that this method is guaranteed to converge. This formulation puts the choice of symbolic dynamics, required for the algorithm, on a firm foundation.
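
    For reference, the Hénon map in question is the two-dimensional quadratic map below; the sketch merely iterates an orbit at the classic parameter values and does not reproduce the paper's continuation-from-anti-integrability algorithm:

      # Hénon map: x' = 1 - a*x**2 + y, y' = b*x. The anti-integrable limit
      # corresponds to letting a grow large; here we just iterate an orbit.
      def henon(x, y, a=1.4, b=0.3):
          return 1.0 - a * x * x + y, b * x

      x, y = 0.1, 0.1
      for _ in range(1000):      # discard the transient
          x, y = henon(x, y)
      for _ in range(5):         # print a few points on the attractor
          x, y = henon(x, y)
          print(f"({x:+.4f}, {y:+.4f})")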

  17. Model of Integration of Material Flow Control System with MES/ERP System via Cloud Computing

    National Research Council Canada - National Science Library

    Peter Peniak

    2014-01-01

    This article deals with a model of an application gateway for the integration of a Material Flow Control System with ERP/MES systems, which are provided through cloud computing and the Software as a Service delivery model...

  18. Direction selectivity is computed by active dendritic integration in retinal ganglion cells.

    Science.gov (United States)

    Sivyer, Benjamin; Williams, Stephen R

    2013-12-01

    Active dendritic integration is thought to enrich the computational power of central neurons. However, a direct role of active dendritic processing in the execution of defined neuronal computations in intact neural networks has not been established. Here we used multi-site electrophysiological recording techniques to demonstrate that active dendritic integration underlies the computation of direction selectivity in rabbit retinal ganglion cells. Direction-selective retinal ganglion cells fire action potentials in response to visual image movement in a preferred direction. Dendritic recordings revealed that preferred-direction moving-light stimuli led to dendritic spike generation in terminal dendrites, which were further integrated and amplified as they spread through the dendritic arbor to the axon to drive action potential output. In contrast, when light bars moved in a null direction, synaptic inhibition vetoed neuronal output by directly inhibiting terminal dendritic spike initiation. Active dendritic integration therefore underlies a physiologically engaged circuit-based computation in the retina.

  19. High Performance Computing tools for the Integrated Tokamak Modelling project

    Energy Technology Data Exchange (ETDEWEB)

    Guillerminet, B., E-mail: bernard.guillerminet@cea.f [Association Euratom-CEA sur la Fusion, IRFM, DSM, CEA Cadarache (France); Plasencia, I. Campos [Instituto de Fisica de Cantabria (IFCA), CSIC, Santander (Spain); Haefele, M. [Universite Louis Pasteur, Strasbourg (France); Iannone, F. [EURATOM/ENEA Fusion Association, Frascati (Italy); Jackson, A. [University of Edinburgh (EPCC) (United Kingdom); Manduchi, G. [EURATOM/ENEA Fusion Association, Padova (Italy); Plociennik, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland); Sonnendrucker, E. [Universite Louis Pasteur, Strasbourg (France); Strand, P. [Chalmers University of Technology (Sweden); Owsiak, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland)

    2010-07-15

    Fusion modelling and simulation are very challenging, and the High Performance Computing issues are addressed here. Toolsets for job launching and scheduling, data communication, and visualization have been developed by the EUFORIA project and used with a plasma edge simulation code.

  20. Integrating Computer-Assisted Translation Tools into Language Learning

    Science.gov (United States)

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  1. Multi-Purpose Thermal Hydraulic Loop: Advanced Reactor Technology Integral System Test (ARTIST) Facility for Support of Advanced Reactor Technologies

    Energy Technology Data Exchange (ETDEWEB)

    James E. O'Brien; Piyush Sabharwall; SuJong Yoon

    2001-11-01

    Effective and robust high-temperature heat transfer systems are fundamental to the successful deployment of advanced reactors for both power generation and non-electric applications. Plant designs often include an intermediate heat transfer loop (IHTL) with heat exchangers at either end to deliver thermal energy to the application while providing isolation of the primary reactor system. In order to address technical feasibility concerns and challenges, a new high-temperature, multi-fluid, multi-loop test facility, the Advanced Reactor Technology Integral System Test (ARTIST) facility, is under development at the Idaho National Laboratory. The facility will include three flow loops: high-temperature helium, molten salt, and steam/water. Details of some of the design aspects and challenges of this facility, which is currently in the conceptual design phase, are discussed.

  2. Higher-Order Integral Equation Methods in Computational Electromagnetics

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Meincke, Peter

    Higher-order integral equation methods have been investigated. The study has focused on improving the accuracy and efficiency of the Method of Moments (MoM) applied to electromagnetic problems. A new set of hierarchical Legendre basis functions of arbitrary order is developed. The new basis...

  3. Insights from the 3rd World Congress on Integrated Computational Materials Engineering

    Science.gov (United States)

    Howe, D.; Goodlet, B.; Weaver, J.; Spanos, G.

    2016-05-01

    The 3rd World Congress on Integrated Computational Materials Engineering (ICME) was a forum for presenting the "state of the art" in the ICME discipline, as well as for charting a path for future community efforts. The event concluded with an interactive panel-led discussion that addressed such topics as integrating efforts between experimental and computational scientists, uncertainty quantification, and identifying the greatest challenges for future workforce preparation. This article is a summary of this discussion and the thoughts presented.

  4. Patterns of Beliefs, Attitudes, and Characteristics of Teachers That Influence Computer Integration

    Directory of Open Access Journals (Sweden)

    Julie Mueller

    2012-01-01

    Full Text Available Despite the continued acceleration of computer access in elementary and secondary schools, the integration of computers as everyday learning tools is not a given. A heterogeneous sample of 185 elementary and 204 secondary teachers was asked to respond to open-ended survey questions in order to understand why the integration of computer-based technologies does or does not fit with their teaching philosophy, what factors impact planning to use computer technologies in the classroom, and what characteristics define excellent teachers who integrate technology. Qualitative analysis of the open-ended questions indicated that, overall, educators are supportive of computer integration, describing the potential of technology in constructivist language such as “authentic tasks” and “self-regulated learning.” Responses from “high”- and “low”-integrating teachers were compared across themes. The diversity of the themes and the emerging patterns of those themes from high and low integrators indicate that the integration of computer technology is a complex concern that requires sensitivity to individual and contextual variables.

  5. DNA-enabled integrated molecular systems for computation and sensing.

    Science.gov (United States)

    LaBoda, Craig; Duschl, Heather; Dwyer, Chris L

    2014-06-17

    CONSPECTUS: Nucleic acids have become powerful building blocks for creating supramolecular nanostructures with a variety of new and interesting behaviors. The predictable and guided folding of DNA, inspired by nature, allows designs to manipulate molecular-scale processes unlike any other material system. Thus, DNA can be co-opted for engineered and purposeful ends. This Account details a small portion of what can be engineered using DNA within the context of computer architectures and systems. Over a decade of work at the intersection of DNA nanotechnology and computer system design has shown several key elements and properties of how to harness the massive parallelism created by DNA self-assembly. This work is presented, naturally, from the bottom-up beginning with early work on strand sequence design for deterministic, finite DNA nanostructure synthesis. The key features of DNA nanostructures are explored, including how the use of small DNA motifs assembled in a hierarchical manner enables full-addressability of the final nanostructure, an important property for building dense and complicated systems. A full computer system also requires devices that are compatible with DNA self-assembly and cooperate at a higher level as circuits patterned over many, many replicated units. Described here is some work in this area investigating nanowire and nanoparticle devices, as well as chromophore-based circuits called resonance energy transfer (RET) logic. The former is an example of a new way to bring traditional silicon transistor technology to the nanoscale, which is increasingly problematic with current fabrication methods. RET logic, on the other hand, introduces a framework for optical computing at the molecular level. This Account also highlights several architectural system studies that demonstrate that even with low-level devices that are inferior to their silicon counterparts and a substrate that harbors abundant defects, self-assembled systems can still

  6. Integration of Chiropractic Services in Military and Veteran Health Care Facilities: A Systematic Review of the Literature.

    Science.gov (United States)

    Green, Bart N; Johnson, Claire D; Daniels, Clinton J; Napuli, Jason G; Gliedt, Jordan A; Paris, David J

    2016-04-01

    This literature review examined studies that described practice, utilization, and policy of chiropractic services within military and veteran health care environments. A systematic search of Medline, CINAHL, and Index to Chiropractic Literature was performed from inception through April 2015. Thirty articles met inclusion criteria. Studies reporting utilization and policy show that chiropractic services are successfully implemented in various military and veteran health care settings and that integration varies by facility. Doctors of chiropractic that are integrated within military and veteran health care facilities manage common neurological, musculoskeletal, and other conditions; severe injuries obtained in combat; complex cases; and cases that include psychosocial factors. Chiropractors collaboratively manage patients with other providers and focus on reducing morbidity for veterans and rehabilitating military service members to full duty status. Patient satisfaction with chiropractic services is high. Preliminary findings show that chiropractic management of common conditions shows significant improvement.

  7. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    Science.gov (United States)

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS) considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due dates, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and the material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved with the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm (GA) is designed. As a consequence, the computational results of this study indicate that the best solutions found by the GA are better than the solutions found by branch-and-bound (B&B) in much less time for both the sequential and concurrent approaches. Moreover, comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement averages around 17% with the GA and 14% with B&B.
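
    As a generic illustration of the search strategy (a toy sketch over permutations, not the paper's chromosome encoding, operators, or CMS cost model):

      # Toy genetic algorithm over machine orderings. The objective below is a
      # placeholder; the paper's makespan/tardiness/handling-cost model is not
      # reproduced here.
      import random

      def cost(perm):
          return sum(abs(a - b) for a, b in zip(perm, perm[1:]))

      def mutate(perm):
          p = perm[:]
          i, j = random.sample(range(len(p)), 2)
          p[i], p[j] = p[j], p[i]          # swap mutation
          return p

      def ga(n=8, pop_size=30, generations=200):
          pop = [random.sample(range(n), n) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=cost)
              survivors = pop[:pop_size // 2]          # truncation selection
              pop = survivors + [mutate(random.choice(survivors))
                                 for _ in range(pop_size - len(survivors))]
          return min(pop, key=cost)

      best = ga()
      print(best, cost(best))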

  8. The explicit computation of integration algorithms and first integrals for ordinary differential equations with polynomials coefficients using trees

    Science.gov (United States)

    Crouch, P. E.; Grossman, Robert

    1992-01-01

    This note is concerned with the explicit symbolic computation of expressions involving differential operators and their actions on functions. The derivation of specialized numerical algorithms, the explicit symbolic computation of integrals of motion, and the explicit computation of normal forms for nonlinear systems all require such computations. More precisely, if R = k(x_1, ..., x_N), where k = R or C, F denotes a differential operator with coefficients from R, and g is a member of R, we describe data structures and algorithms for efficiently computing F(g). The basic idea is to impose a multiplicative structure on the vector space whose basis is the set of finite rooted trees and whose nodes are labeled with the coefficients of the differential operators. Cancellation of two trees with r + 1 nodes translates into cancellation of O(N^r) expressions involving the coefficient functions and their derivatives.
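
    A concrete instance of the computation being organized (SymPy is used purely for illustration; the operator F below is an arbitrary example, and it is the paper's tree-based data structures that make such evaluations efficient for large N and high-order operators):

      # Applying a differential operator F with coefficients in R = k(x1, x2)
      # to a function g in R; the rooted-tree structures organize exactly
      # this kind of expansion.
      import sympy as sp

      x1, x2 = sp.symbols("x1 x2")
      g = x1**2 * x2 + 1 / x2                  # g in R

      def F(h):
          # example operator: F = x2 * d/dx1 + x1 * d^2/dx2^2
          return x2 * sp.diff(h, x1) + x1 * sp.diff(h, x2, 2)

      print(sp.simplify(F(g)))                 # -> 2*x1*x2**2 + 2*x1/x2**3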

  9. Preliminary experimental results using the thermal-hydraulic integral test facility (VISTA) for the pilot plant of the system integrated modular advanced reactor, SMART-P

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Ki Yong; Pak, Hyun Sik; Cho, Seok; Pak, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa; Chung, Moon Ki [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Preliminary experimental tests were carried out using the thermal-hydraulic integral test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents), which has been constructed to simulate the SMART-P. The VISTA facility is an integral test facility including the primary and secondary systems as well as the safety-related Passive Residual Heat Removal (PRHR) system. Its scaling ratio with respect to the SMART-P is 1/1 in height and 1/96 in volume and heater power. So far, several steady-state and transient tests have been carried out to verify the overall thermal-hydraulic characteristics of the primary and secondary systems in a range of 10% to 100% power operation. The preliminary results show that the steady-state conditions coincide with the expected design values of the SMART-P, but the major thermal-hydraulic parameters are greatly affected by the initial water level and the nitrogen pressure in the reactor upper annular cavity. In the PRHR transient tests, the steam inlet temperature of the PRHR system was found to drop suddenly from a superheated condition to a saturated condition at the end of PRHR operation.

  10. Landau Theory of Adaptive Integration in Computational Intelligence

    CERN Document Server

    Plewczynski, Dariusz

    2010-01-01

    Computational Intelligence (CI) is a sub-branch of the Artificial Intelligence paradigm focusing on the study of adaptive mechanisms to enable or facilitate intelligent behavior in complex and changing environments. There are several paradigms of CI [such as artificial neural networks, evolutionary computation, swarm intelligence, artificial immune systems, fuzzy systems, and many others], each of which has its origins in biological systems [biological neural systems, natural Darwinian evolution, social behavior, the immune system, interactions of organisms with their environment]. Most of these paradigms evolved into separate machine learning (ML) techniques, where probabilistic methods are used complementarily with CI techniques in order to effectively combine elements of learning, adaptation, evolution, and fuzzy logic to create heuristic algorithms that are, in some sense, intelligent. The current trend is to develop consensus techniques, since no single machine learning algorithm is superior to others in all possible...

  11. An Integrated Review of Emoticons in Computer-Mediated Communication

    OpenAIRE

    Aldunate, Nerea; González-Ibáñez, Roberto

    2017-01-01

    Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. In contrast, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, so facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic c...

  12. Data integrity by validation of a computer based laboratory system

    OpenAIRE

    Jasmin, Ramić

    2017-01-01

    The thesis deals with the assurance of regulatory compliance and validation of computer supported laboratory systems in the pharmaceutical industry. It describes the functioning and importance of regulatory authorities and explains the standards and good practice examples to be observed in the validation process. By actively introducing the act on electronic records and signatures, food and drug agencies have succeeded in setting up clear requirements and guidelines in the pharmaceutical indu...

  13. Use of Integrated Computational Approaches in the Search for New Therapeutic Agents.

    Science.gov (United States)

    Persico, Marco; Di Dato, Antonio; Orteca, Nausicaa; Cimino, Paola; Novellino, Ettore; Fattorusso, Caterina

    2016-09-01

    Computer-aided drug discovery plays a strategic role in the development of new potential therapeutic agents. Nevertheless, the modeling of biological systems still represents a challenge for computational chemists, and at present no single computational method able to face such a challenge is available. This prompted us, as computational medicinal chemists, to develop in-house methodologies by combining various bioinformatics and computational tools. Importantly, thanks to multi-disciplinary collaborations, our computational studies were integrated with and validated by experimental data in an iterative process. In this review, we describe some recent applications of such integrated approaches and how they were successfully applied in (i) the search for new allosteric inhibitors of protein-protein interactions and (ii) the development of new redox-active antimalarials from natural leads.

  14. Implicit Unstructured Computational Aerodynamics on Many-Integrated Core Architecture

    KAUST Repository

    Al Farhan, Mohammed A.

    2014-05-04

    This research aims to understand the performance of PETSc-FUN3D, a fully nonlinear implicit unstructured-grid incompressible or compressible Euler code with origins at NASA and the U.S. DOE, on many-integrated-core architecture, and how a hybrid programming paradigm (MPI+OpenMP) can exploit Intel Xeon Phi hardware with upwards of 60 cores per node and 4 threads per core. For the current contribution, we focus on strong scaling with many-integrated-core hardware. In most implicit PDE-based codes, while the linear algebraic kernel is limited by the bottleneck of memory bandwidth, the flux kernel arising in the control volume discretization of the conservation law residuals and the preconditioner for the Jacobian exploit the Phi hardware well.

  15. Integration-by-parts reductions from the viewpoint of computational algebraic geometry

    CERN Document Server

    Larsen, Kasper J

    2016-01-01

    Integration-by-parts reductions play a central role in perturbative QFT calculations. They allow the set of Feynman integrals contributing to a given observable to be reduced to a small set of basis integrals, and they moreover facilitate the computation of those basis integrals. We introduce an efficient new method for generating integration-by-parts reductions. This method simplifies the task by making use of generalized-unitarity cuts and turns the problem of finding the needed total derivatives into one of solving certain polynomial (so-called syzygy) equations.
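
    The underlying identity is the vanishing of a total derivative under the dimensionally regularized integral (standard form, shown for orientation rather than quoted from the paper):

      \int \frac{d^D k}{(2\pi)^D} \, \frac{\partial}{\partial k^\mu}
      \left( \frac{v^\mu}{D_1^{a_1} \cdots D_n^{a_n}} \right) = 0

    where v^\mu is built from loop and external momenta and the D_i are inverse propagators. Expanding the derivative produces linear relations among integrals with shifted exponents a_i; the syzygy equations mentioned above select vectors v^\mu for which no propagator powers are raised.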

  16. Measurement of integral diffraction coefficients of crystals on beamline 4B7 of Beijing Synchrotron Radiation Facility

    Institute of Scientific and Technical Information of China (English)

    Yang Jia-Min; Hu Zhi-Min; Wei Min-Xi; Zhang Ji-Yan; Yi Rong-Qing; Gan Xin-Shi; Zhao Yang; Cui Ming-Qi; Zhu Tuo; Zhao Yi-Dong; Sun Li-Juan; Zheng Lei; Yan Fen

    2011-01-01

    Integral diffraction coefficients of a crystal are essential data for a crystal spectrometer, which is extensively used to measure quantitative x-ray spectra of high-temperature plasmas in the kilo-electron-volt region. An experimental method has been developed to measure the integral diffraction coefficients of crystals on beamline 4B7 of the Beijing Synchrotron Radiation Facility. The integral diffraction coefficients of several crystals, including polyethylene terephthalate (PET), thallium acid phthalate (TlAP), and rubidium acid phthalate (RAP), have been measured in the x-ray energy range 2100-5600 eV and compared with calculations from the 'Darwin-Prins' and 'Mosaic' models. It is shown that the integral diffraction coefficients of these crystals lie between the calculations of the 'Darwin-Prins' and 'Mosaic' models, but closer to the 'Darwin-Prins' model calculations.

  17. Integrating Reconfigurable Hardware-Based Grid for High Performance Computing

    Directory of Open Access Journals (Sweden)

    Julio Dondo Gazzano

    2015-01-01

    Full Text Available FPGAs have shown several characteristics that make them very attractive for high performance computing (HPC). The impressive speed-up factors that they are able to achieve, the reduced power consumption, and the ease and flexibility of the design process, with fast iterations between consecutive versions, are examples of the benefits obtained with their use. However, there are still some difficulties when using reconfigurable platforms as accelerators that need to be addressed: the need for an in-depth application study to identify potential acceleration, the lack of tools for the deployment of computational problems on distributed hardware platforms, and the low portability of components, among others. This work proposes a complete grid infrastructure for distributed high performance computing based on dynamically reconfigurable FPGAs. Besides, a set of services designed to facilitate application deployment is described. An example application and a comparison with other hardware and software implementations are shown. Experimental results show that the proposed architecture offers encouraging advantages for the deployment of high performance distributed applications, simplifying the development process.

  18. Innovation and Integration: Case Studies of Effective Teacher Practices in the Use of Handheld Computers

    Science.gov (United States)

    Chavez, Raymond Anthony

    2010-01-01

    Previous research conducted on the use of handheld computers in K-12 education has focused on how handheld computer use affects student motivation, engagement, and productivity. These four case studies sought to identify effective teacher practices in the integration of handhelds into the curriculum and the factors that affect those practices. The…

  19. Integrating Computer-Assisted Language Learning in Saudi Schools: A Change Model

    Science.gov (United States)

    Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea

    2015-01-01

    Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…

  20. INTERDISCIPLINARY INTEGRATION IN THE COURSE OF STUDYING WEB TECHNOLOGIES AND COMPUTER GRAPHICS

    Directory of Open Access Journals (Sweden)

    Andrey V. Kolesnikov

    2013-01-01

    Full Text Available The expediency of interdisciplinary integration in the large-scale introduction of the competence approach is substantiated. A methodology for interdisciplinary integration is presented in the article. The authors demonstrate this methodology using the example of two disciplines: «The development and maintenance of information web-resources» and «Computer Graphics and Multimedia».

  1. An Exploration of Three-Dimensional Integrated Assessment for Computational Thinking

    Science.gov (United States)

    Zhong, Baichang; Wang, Qiyun; Chen, Jie; Li, Yi

    2016-01-01

    Computational thinking (CT) is a fundamental skill for students, and assessment is a critical factor in education. However, there is a lack of effective approaches to CT assessment. Therefore, we designed the Three-Dimensional Integrated Assessment (TDIA) framework in this article. The TDIA has two aims: one is to integrate three dimensions…

  2. An integrative computational approach for prioritization of genomic variants.

    Directory of Open Access Journals (Sweden)

    Inna Dubchak

    Full Text Available An essential step in the discovery of molecular mechanisms contributing to disease phenotypes and efficient experimental planning is the development of weighted hypotheses that estimate the functional effects of sequence variants discovered by high-throughput genomics. With the increasing specialization of bioinformatics resources, creating analytical workflows that seamlessly integrate data and bioinformatics tools developed by multiple groups becomes inevitable. Here we present a case study of the use of a distributed analytical environment integrating four complementary specialized resources, namely the Lynx platform, VISTA RViewer, the Developmental Brain Disorders Database (DBDB), and the RaptorX server, for the identification of high-confidence candidate genes contributing to the pathogenesis of spina bifida. The analysis resulted in the prediction and validation of deleterious mutations in the SLC19A placental transporter in mothers of the affected children that cause narrowing of the outlet channel and therefore lead to a reduced folate permeation rate. The described approach also enabled the correct identification of several genes previously shown to contribute to the pathogenesis of spina bifida, and the suggestion of additional genes for experimental validation. The study demonstrates that the seamless integration of bioinformatics resources enables fast and efficient prioritization and characterization of genomic factors and molecular networks contributing to the phenotypes of interest.

  3. An Integrative Computational Approach for Prioritization of Genomic Variants

    Science.gov (United States)

    Wang, Sheng; Meyden, Cem; Sulakhe, Dinanath; Poliakov, Alexander; Börnigen, Daniela; Xie, Bingqing; Taylor, Andrew; Ma, Jianzhu; Paciorkowski, Alex R.; Mirzaa, Ghayda M.; Dave, Paul; Agam, Gady; Xu, Jinbo; Al-Gazali, Lihadh; Mason, Christopher E.; Ross, M. Elizabeth; Maltsev, Natalia; Gilliam, T. Conrad

    2014-01-01

    An essential step in the discovery of molecular mechanisms contributing to disease phenotypes, and in efficient experimental planning, is the development of weighted hypotheses that estimate the functional effects of sequence variants discovered by high-throughput genomics. With the increasing specialization of bioinformatics resources, creating analytical workflows that seamlessly integrate data and bioinformatics tools developed by multiple groups becomes inevitable. Here we present a case study of the use of a distributed analytical environment integrating four complementary specialized resources, namely the Lynx platform, VISTA RViewer, the Developmental Brain Disorders Database (DBDB), and the RaptorX server, for the identification of high-confidence candidate genes contributing to the pathogenesis of spina bifida. The analysis resulted in the prediction and validation of deleterious mutations in the SLC19A placental transporter in mothers of the affected children that cause narrowing of the outlet channel and therefore lead to a reduced folate permeation rate. The described approach also enabled correct identification of several genes previously shown to contribute to the pathogenesis of spina bifida, and the suggestion of additional genes for experimental validation. The study demonstrates that the seamless integration of bioinformatics resources enables fast and efficient prioritization and characterization of genomic factors and molecular networks contributing to the phenotypes of interest. PMID:25506935

  4. Accurate computation of Galerkin double surface integrals in the 3-D boundary element method

    CERN Document Server

    Adelman, Ross; Duraiswami, Ramani

    2015-01-01

    Many boundary element integral equation kernels are based on the Green's functions of the Laplace and Helmholtz equations in three dimensions. These include, for example, the Laplace, Helmholtz, elasticity, Stokes, and Maxwell's equations. Integral equation formulations lead to more compact, but dense linear systems. These dense systems are often solved iteratively via Krylov subspace methods, which may be accelerated via the fast multipole method. There are advantages to Galerkin formulations for such integral equations, as they treat problems associated with kernel singularity, and lead to symmetric and better conditioned matrices. However, the Galerkin method requires each entry in the system matrix to be created via the computation of a double surface integral over one or more pairs of triangles. There are a number of semi-analytical methods to treat these integrals, which all have some issues, and are discussed in this paper. We present novel methods to compute all the integrals that arise in Galerkin fo...
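
    As a rough illustration of the quantity being assembled, the sketch below evaluates one Galerkin matrix entry for the Laplace kernel over a pair of well-separated triangles with a fixed-order quadrature rule. This naive approach is exactly what fails for the singular and near-singular pairs the paper targets; the code is an illustrative Python sketch, not the authors' method.

```python
import numpy as np

# Degree-2 quadrature rule on the reference triangle: points in (u, v)
# coordinates, weights sum to 1 and are scaled by the triangle area below.
QP = np.array([[1/6, 1/6], [2/3, 1/6], [1/6, 2/3]])
QW = np.array([1/3, 1/3, 1/3])

def triangle_points(verts):
    """Map the reference quadrature points onto a triangle in 3-D."""
    a, b, c = verts
    return np.array([a + u * (b - a) + v * (c - a) for u, v in QP])

def triangle_area(verts):
    a, b, c = verts
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a))

def galerkin_entry(tri_x, tri_y):
    """Double surface integral of 1/(4*pi*|x-y|) over two triangles.
    Only adequate when the triangles are well separated."""
    xs, ys = triangle_points(tri_x), triangle_points(tri_y)
    total = 0.0
    for wx, x in zip(QW, xs):
        for wy, y in zip(QW, ys):
            total += wx * wy / (4.0 * np.pi * np.linalg.norm(x - y))
    return total * triangle_area(tri_x) * triangle_area(tri_y)

tri1 = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
tri2 = np.array([[3., 0., 1.], [4., 0., 1.], [3., 1., 1.]])
print(galerkin_entry(tri1, tri2))
```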

  5. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  6. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dali [ORNL; Zhao, Ziliang [University of Tennessee, Knoxville (UTK); Shaw, Shih-Lung [ORNL

    2011-01-01

    In this paper, we describe an approach to integrating a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model was developed in a desktop computing environment. We use it to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although this is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.

  7. Fuzzy model of the computer integrated decision support and management system in mineral processing

    Directory of Open Access Journals (Sweden)

    Miljanović Igor

    2008-01-01

    Full Text Available During research on computer integrated systems for decision making and management support in mineral processing based on fuzzy logic, carried out at the Department of Applied Computing and System Engineering of the Faculty of Mining and Geology, University of Belgrade, for the doctoral thesis of the first author and the wider demands of the mineral industry, the incompleteness of the fuzzy models in existing and contemporary computer integrated systems was noticed. The paper presents an original model with a seven-stage hierarchical monitoring-management structure, in which the shortcomings of the models utilized today are eliminated.

  8. Continuous Integration for Concurrent Computational Framework and Application Development

    Directory of Open Access Journals (Sweden)

    Derek R Gaston

    2014-07-01

    Full Text Available Development of scientific software relies on specialized knowledge from a broad range of disciplines including computer science, mathematics, engineering, and the natural sciences. Since it is rare for a given practitioner to simultaneously be an expert in each of the aforementioned fields, teamwork and collaboration are now the norm for scientific software development. This short paper discusses specific software development conventions that have led to the success of the MOOSE multiphysics framework at Idaho National Laboratory (INL), and ongoing plans to bring MOOSE to a wider community of developers as an open source project on GitHub.

  9. A computational- And storage-cloud for integration of biodiversity collections

    Science.gov (United States)

    Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C.C; Collins, M.; Beeman, R.S; Macfadden, B.J.; Riccardi, G.; Soltis, P.S; Page, L. M.; Fortes, J.A.B

    2013-01-01

    A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.

  10. Computing Systems Configuration for Highly Integrated Guidance and Control Systems

    Science.gov (United States)

    1988-06-01

    [Abstract garbled in the source; recoverable French fragments, translated: interactive equipment (notably aircraft instrument panels for crew training); the LANST software, designed for the simulation of local area networks; and overall performance gains from the new integrated guidance and piloting, avionics, and weapon-delivery systems.]

  11. Methods for using computer training facilities in studies of special disciplines

    Directory of Open Access Journals (Sweden)

    O.L. Tashlykov

    2016-12-01

    The use of the analytical simulator is illustrated by a laboratory research project entitled “BN-800 Reactor Power Maneuvering”, which investigates the reactor facility power control modes in a power range of 100–80–100% of the rated power.

  12. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [FRIB, MSU; Mokhov, Nikolai [FNAL; Niita, Koji [RIST, Japan

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport codes it is used with, and is connected to them by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code, and it can be used with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file; this facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. It can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
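
    The two ingredients named in the abstract, demand-driven scheduling over MPI and a checkpoint facility, can be sketched as follows. This is a minimal Python/mpi4py sketch standing in for the C++/MPI module; run_batch is a hypothetical placeholder for a call into a transport code.

```python
# Run with: mpiexec -n 4 python master_worker.py  (assumes >= 2 ranks)
import pickle
import random

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
N_BATCHES, CKPT_EVERY = 100, 10

def run_batch(seed):
    random.seed(seed)            # stand-in for a transport-code call
    return sum(random.random() for _ in range(1000))

if rank == 0:                    # master: hand out batches on demand
    tally, sent, done, status = 0.0, 0, 0, MPI.Status()
    for w in range(1, size):     # prime every worker with one batch
        comm.send(sent, dest=w)
        sent += 1
    while done < N_BATCHES:
        tally += comm.recv(source=MPI.ANY_SOURCE, status=status)
        done += 1
        if done % CKPT_EVERY == 0:       # periodic checkpoint
            with open("ckpt.pkl", "wb") as f:
                pickle.dump((done, tally), f)
        # send the next batch (or a stop signal) to whoever just finished
        comm.send(sent if sent < N_BATCHES else None,
                  dest=status.Get_source())
        sent += 1
    print("tally:", tally)
else:                            # worker: loop until told to stop
    while True:
        seed = comm.recv(source=0)
        if seed is None:
            break
        comm.send(run_batch(seed), dest=0)
```

    Because workers implicitly ask for more work by returning results, a slow or shared node simply processes fewer batches, which is the load-balancing behavior the framework aims for.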

  13. Straighttalk. The ideal master facility plan begins with business strategy and integrates operational improvement.

    Science.gov (United States)

    Powder, Scott; Brown, Richard E; Haupert, John M; Smith, Ryder

    2007-04-02

    Given the scarcity of capital to meet ever-growing demands for healthcare services, master facility planning has become more important than ever. Executives must align their master facility plans with their overall business strategy, incorporating the best in care- and service-delivery models. In this installment of Straight Talk, executives from two health systems--Advocate Health Care in Oak Brook, Ill. and Parkland Health & Hospital System in Dallas--discuss master facility planning. Modern Healthcare and PricewaterhouseCoopers present Straight Talk. The session on master facility planning was held on March 8, 2007 at Modern Healthcare's Chicago Headquarters. Charles Lauer, former vice president of publishing and editorial director at Modern Healthcare, was the moderator.

  14. Integrated STEM in Elementary Grades Using Distributed Agent-based Computation

    CERN Document Server

    Sengupta, Pratim; Wright, Mason

    2014-01-01

    We investigate how the integration of visual agent-based programming and computationally augmented physical structures can support curricular integration across STEM domains for elementary grade students. We introduce ViMAP-Tangible, a socio-technically distributed computational learning environment, which integrates ultrasonic sensors with the ViMAP visual programming language using a distributed computation infrastructure. In this paper, we report a study in which 3rd and 4th grade students used ViMAP-Tangible to engage in collaborative design-based activities in order to invent 'drawing machines' for generating geometric shapes. The curricular activities integrate engineering practices such as user-centered design, mathematical reasoning about multiplication, rates and fractions, and physical science concepts central to learning Newtonian mechanics. We identify the key affordances of the learning environment and our pedagogical approach in terms of the relationship between the structural elements of studen...

  15. Integration of thermoelectrics and photovoltaics as auxiliary power sources in mobile computing applications

    Energy Technology Data Exchange (ETDEWEB)

    Muhtaroglu, Ali; von Jouanne, Annette [School of Electrical Engineering and Computer Science, Oregon State University, Corvallis, OR 97331-5501 (United States); Yokochi, Alex [School of Chemical, Biological and Environmental Engineering, Oregon State University, Corvallis, OR 97331-2702 (United States)

    2008-02-15

    The inclusion of renewable technologies as auxiliary power sources in mobile computing platforms can lead to improved performance such as the extension of battery life. This paper presents sustainable power management characteristics and performance enhancement opportunities in mobile computing systems resulting from the integration of thermoelectric generators and photovoltaic units. Thermoelectric generators are employed for scavenging waste heat from processors or other significant components in the computer's chipset while the integration of photovoltaic units is demonstrated for generating power from environmental illumination. A scalable and flexible power architecture is also verified to effectively integrate these renewable energy sources. This paper confirms that battery life extension can be achieved through the appropriate integration of renewable sources such as thermoelectric and photovoltaic devices. (author)
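
    For a sense of scale, the matched-load output of a thermoelectric generator is P = (S * dT)^2 / (4R). A minimal sketch with illustrative module values, not taken from the paper:

```python
# Matched-load thermoelectric generator output: P = (S * dT)**2 / (4 * R).
S = 0.05   # module Seebeck coefficient, V/K (illustrative)
R = 2.0    # module internal resistance, ohm (illustrative)
dT = 20.0  # temperature difference across the module, K

P = (S * dT) ** 2 / (4 * R)
print(f"auxiliary power per module: {P * 1000:.0f} mW")  # 125 mW
```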

  16. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    Science.gov (United States)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  17. Final Report. DOE Computational Nanoscience Project DE-FG02-03ER46096: Integrated Multiscale Modeling of Molecular Computing Devices

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, Peter [Vanderbilt University

    2009-11-15

    The document is the final report of the DOE Computational Nanoscience Project DE-FG02-03ER46096: Integrated Multiscale Modeling of Molecular Computing Devices. It includes references to 62 publications that were supported by the grant.

  18. Integrating ethical topics in a traditional computer science course

    Energy Technology Data Exchange (ETDEWEB)

    Winrich, L.B. [Univ. of North Dakota, Grand Forks, ND (United States)

    1994-12-31

    It is never hard to find additional, often unconventional, topics that seem to beg inclusion in standard courses. A dynamic discipline like computer science usually provides a steady stream of new technical ideas to vie for time and attention with more traditional material. As difficult as it may be to keep standard CS courses up to date with technical innovations, it often seems even more difficult to include non-technical topics even when there is universal agreement on their importance. Inevitably, the question arises of whether such inclusion will compromise the technical content of the course. This paper describes an attempt to include two such topics in a traditional course in data structures. The two topics are writing and ethics; although the effort concentrates on the inclusion of ethical questions in a standard CS course, writing is the vehicle for accomplishing this goal. Furthermore, the inclusion of writing in the CS curriculum is itself recognized as a desirable outcome.

  19. Decision trees and integrated features for computer aided mammographic screening

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, W.P. Jr.; Groshong, B.; Allmen, M.; Woods, K.

    1997-02-01

    Breast cancer is a serious problem, which in the United States causes 43,000 deaths a year, eventually striking 1 in 9 women. Early detection is the only effective countermeasure, and mass mammography screening is the only reliable means for early detection. Mass screening has many shortcomings which could be addressed by a computer-aided mammographic screening system. Accordingly, we have applied the pattern recognition methods developed in earlier investigations of spiculated lesions in mammograms to the detection of microcalcifications and circumscribed masses, generating new, more rigorous and uniform methods for the detection of both of those signs. We have also improved the pattern recognition methods themselves, through the development of a new approach to combinations of multiple classifiers.
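
    The combination of multiple classifiers can be illustrated with a generic soft-voting ensemble; this is a stand-in for, not a reproduction of, the paper's combination method, and the data are synthetic.

```python
# Soft-voting ensemble over three heterogeneous classifiers.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data standing in for mammographic features.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=6)),
                ("nb", GaussianNB()),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft")  # average the predicted class probabilities
ensemble.fit(Xtr, ytr)
print("held-out accuracy:", ensemble.score(Xte, yte))
```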

  20. Integrated Hardware and Software for No-Loss Computing

    Science.gov (United States)

    James, Mark

    2007-01-01

    When an algorithm is distributed across multiple threads executing on many distinct processors, the loss of one of those threads or processors can potentially result in the total loss of all the incremental results up to that point. When the implementation is massively hardware-distributed, the probability of a hardware failure during the course of a long execution is potentially high. Traditionally, this problem has been addressed by establishing checkpoints where the current state of some or all of the execution is saved; in the event of a failure, this state information can be used to recompute that point in the execution and resume the computation from there. A serious problem that arises when one distributes a computation across multiple threads and physical processors is that the likelihood of the algorithm failing increases, through no fault of the scientist, as a result of hardware faults coupled with operating-system problems. With good reason, scientists expect their computing tools to serve them and not the other way around. What is novel here is a unique combination of hardware and software that reformulates an application into a monolithic structure that can be monitored in real time and dynamically reconfigured in the event of a failure. This unique reformulation of hardware and software will provide advanced aeronautical technologies to meet the challenges of next-generation systems in aviation, for civilian and scientific purposes, in our atmosphere and in the atmospheres of other worlds. In particular, with respect to NASA's manned flight to Mars, this technology addresses the critical requirements for improving safety and increasing reliability of manned spacecraft.
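
    The checkpoint-and-resume pattern described above reduces to a few lines when the iteration state is serializable; a minimal sketch (plain Python, with an invented workload) that restarts from the last checkpoint after a failure:

```python
import os
import pickle

CKPT = "state.pkl"

def load_state():
    if os.path.exists(CKPT):          # a previous run left a checkpoint
        with open(CKPT, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "acc": 0.0}    # otherwise start fresh

state = load_state()
for step in range(state["step"], 1_000_000):
    state["acc"] += step * 1e-6       # stand-in for real work
    state["step"] = step + 1
    if step % 10_000 == 0:            # periodic checkpoint
        with open(CKPT + ".tmp", "wb") as f:
            pickle.dump(state, f)
        os.replace(CKPT + ".tmp", CKPT)  # atomic swap, no torn files
print(state["acc"])
```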

  1. Medicinal electrochemistry: integration of electrochemistry, medicinal chemistry and computational chemistry.

    Science.gov (United States)

    Almeida, M O; Maltarollo, V G; de Toledo, R A; Shim, H; Santos, M C; Honorio, K M

    2014-01-01

    Over the last centuries, there were many important discoveries in medicine that were crucial for gaining a better understanding of several physiological processes. Molecular modelling techniques are powerful tools that have been successfully used to analyse and interface medicinal chemistry studies with electrochemical experimental results. This special combination can help to comprehend medicinal chemistry problems, such as predicting biological activity and understanding drug action mechanisms. Electrochemistry has provided better comprehension of biological reactions and, as a result of many technological improvements, the combination of electrochemical techniques and biosensors has become an appealing choice for pharmaceutical and biomedical analyses. Therefore, this review will briefly outline the present scope and future advances related to the integration of electrochemical and medicinal chemistry approaches based on various applications from recent studies.

  2. Bio-Search Computing: integration and global ranking of bioinformatics search results.

    Science.gov (United States)

    Masseroli, Marco; Ghisalberti, Giorgio; Ceri, Stefano

    2011-09-06

    In the Life Sciences, numerous questions can be addressed only by comprehensively searching different types of data that are inherently ordered, or are associated with ranked confidence values. We previously proposed Search Computing to support the integration of the results of search engines with other data and computational resources. This paper presents how well-known bioinformatics resources can be described as search services in the Search Computing framework and how integrated analyses over such services can be carried out. An initial set of bioinformatics services has been described and registered in the framework, and a bioinformatics Search Computing (Bio-SeCo) application using these services has been created. The current prototype application, the services it uses, the queries that are supported, the kind of interaction thereby made available to users, and future scenarios are described and discussed here.
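
    A toy version of global ranking over ranked service results, using weighted reciprocal-rank aggregation (one simple choice among many; Bio-SeCo's actual join and ranking strategies are richer, and the gene lists below are invented):

```python
def global_rank(result_lists, weights):
    """Merge several ranked lists into one globally ranked list."""
    scores = {}
    for ranked, weight in zip(result_lists, weights):
        for pos, item in enumerate(ranked, start=1):
            scores[item] = scores.get(item, 0.0) + weight / pos
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ranked outputs from two search services.
genes_by_expression = ["TP53", "BRCA1", "MTHFR", "EGFR"]
genes_by_literature = ["MTHFR", "TP53", "VEGFA"]
print(global_rank([genes_by_expression, genes_by_literature], [0.6, 0.4]))
```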

  3. A simulation-based robust biofuel facility location model for an integrated bio-energy logistics network

    Directory of Open Access Journals (Sweden)

    Jae-Dong Hong

    2014-10-01

    Full Text Available Purpose: The purpose of this paper is to propose a simulation-based robust biofuel facility location model for solving an integrated bio-energy logistics network (IBLN) problem, where biomass yield is often uncertain or difficult to determine. Design/methodology/approach: The IBLN considered in this paper consists of four different facilities: farm or harvest site (HS), collection facility (CF), biorefinery (BR), and blending station (BS). The authors propose a mixed integer quadratic modeling approach to simultaneously determine the optimal CF and BR locations and the corresponding biomass and bio-energy transportation plans. The authors randomly generate the biomass yield of each HS, find the optimal locations of CFs and BRs for each generated biomass yield, and select the robust locations of CFs and BRs to show the effects of biomass yield uncertainty on the optimality of CF and BR locations. Case studies using data from the State of South Carolina in the United States are conducted to demonstrate the developed model's capability to better handle the impact of uncertainty in biomass yield. Findings: The results illustrate that the robust location model for BRs and CFs works very well in terms of total logistics costs. The proposed model would help decision-makers find the most robust locations for biorefineries and collection facilities, which usually require huge investments, and would assist potential investors in identifying the least-cost or most important facilities to invest in within the biomass and bio-energy industry. Originality/value: An optimal biofuel facility location model is formulated for the case of deterministic biomass yield. To improve the robustness of the model for cases with probabilistic biomass yield, the model is evaluated by a simulation approach using case studies. The proposed model and robustness concept would be a very useful tool that helps potential biofuel investors minimize their investment risk.
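
    The robustness idea, choosing locations that hold up across sampled biomass yields rather than for one forecast, can be sketched in a few lines. All sites, farms and costs below are invented, and simple enumeration replaces the paper's mixed integer quadratic program:

```python
import random

random.seed(1)
sites = {"A": (0, 0), "B": (5, 2), "C": (2, 6)}   # candidate CF sites
farms = [(1, 1), (4, 3), (3, 5), (6, 1)]          # harvest sites (HS)

def cost(site, yields):
    """Yield-weighted Euclidean haul cost from every farm to the site."""
    x, y = sites[site]
    return sum(q * ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
               for (fx, fy), q in zip(farms, yields))

# Sample uncertain yields, then pick the site with the best worst case.
samples = [[random.uniform(50, 150) for _ in farms] for _ in range(200)]
worst = {s: max(cost(s, ys) for ys in samples) for s in sites}
print("robust (min-max) site:", min(worst, key=worst.get))
```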

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. A new framework to integrate wireless sensor networks with cloud computing

    Science.gov (United States)

    Shah, Sajjad Hussain; Khan, Fazle Kabeer; Ali, Wajid; Khan, Jamshed

    Wireless sensor networks have several applications of their own. These applications can be further enhanced by connecting a local wireless sensor network to the Internet, enabling real-time applications in which sensor results are stored in the cloud. We propose an architecture that integrates a wireless sensor network with the Internet using cloud technology; the resulting system proves to be reliable, available and extensible. In this paper a new framework is proposed for WSN integration with the cloud computing model, to which existing WSNs can be connected. Three deployment layers (IaaS, PaaS, SaaS) are used to serve user requests, either directly or from a library built from the data collected periodically by the WSN at the data centre (DC). The integration controller unit of the proposed framework integrates the sensor network and cloud computing technology, offering reliability, availability and extensibility.

  6. Development and evaluation of an integrated emergency response facility location model

    Directory of Open Access Journals (Sweden)

    Jae-Dong Hong

    2012-06-01

    Full Text Available Purpose: The purpose of this paper is to propose and compare the performance of two robust mathematical models, the Robust Integer Facility Location (RIFL) and the Robust Continuous Facility Location (RCFL) models, for solving emergency response facility location and transportation problems in terms of total logistics cost and robustness. Design/methodology/approach: The emergency response facilities include distribution warehouses (DWH), where relief goods are stored, commodity distribution points (CDP), and neighborhood locations. The authors propose two robust models: the Robust Integer Facility Location (RIFL) model, where the demand of a CDP is covered by a main DWH or a backup CDP, and the Robust Continuous Facility Location (RCFL) model, where the demand of a CDP is covered by multiple DWHs. The performance of these models is compared with each other and with the Regular Facility Location (RFL) model, where a CDP is covered by one main DWH. Case studies with multiple scenarios are analyzed. Findings: The results illustrate that the RFL outperforms the others under normal conditions while the RCFL outperforms the others under emergency conditions. Overall, the total logistics cost and robustness level of the RCFL outperform those of the other models, while the performance of the RFL and RIFL is mixed between cost and the robustness index. Originality/value: Two new emergency distribution approaches are modeled and evaluated using case studies. In addition to the total logistics cost, a robustness index is uniquely presented and applied. The proposed models and robustness concept are hoped to shed light on future work in the field of disaster logistics management.

  7. Computational Studies of X-ray Framing Cameras for the National Ignition Facility

    Science.gov (United States)

    2013-06-01

    The NIF is the world's most powerful laser facility and is... a phosphor screen where the output is recorded. The x-ray framing cameras have provided excellent information. As the yields at NIF have increased... experiments on the NIF. In basic operation, incident photons generate photoelectrons both in the pores of the MCP and

  8. Computing multiple aggregation levels and contextual features for road facilities recognition using mobile laser scanning data

    Science.gov (United States)

    Yang, Bisheng; Dong, Zhen; Liu, Yuan; Liang, Fuxun; Wang, Yongjun

    2017-04-01

    In recent years, updating the inventory of road infrastructures based on field work is labor intensive, time consuming, and costly. Fortunately, vehicle-based mobile laser scanning (MLS) systems provide an efficient solution to rapidly capture three-dimensional (3D) point clouds of road environments with high flexibility and precision. However, robust recognition of road facilities from huge volumes of 3D point clouds is still a challenging issue because of complicated and incomplete structures, occlusions and varied point densities. Most existing methods utilize point- or object-based features to recognize object candidates, and can only extract limited types of objects with a relatively low recognition rate, especially for incomplete and small objects. To overcome these drawbacks, this paper proposes a semantic labeling framework that combines multiple aggregation levels (point-segment-object) of features and contextual features to recognize road facilities, such as road surfaces, road boundaries, buildings, guardrails, street lamps, traffic signs, roadside trees, power lines, and cars, for highway infrastructure inventory. The proposed method first identifies ground and non-ground points, and extracts road surface facilities from the ground points. Non-ground points are segmented into individual candidate objects based on the proposed multi-rule region growing method. Then, the multiple aggregation levels of features and the contextual features (relative positions, relative directions, and spatial patterns) associated with each candidate object are calculated and fed into a SVM classifier to label the corresponding candidate object. The recognition performance of combining multiple aggregation levels and contextual features was compared with single-level (point, segment, or object) based features using large-scale highway scene point clouds. Comparative studies demonstrated that the proposed semantic labeling framework significantly improves road facilities recognition
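
    The labeling step lends itself to a generic sketch: concatenate point-, segment- and object-level descriptors with contextual features and train an SVM. The feature values below are synthetic placeholders; the real pipeline extracts them from MLS point clouds.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500
point_feats = rng.normal(size=(n, 4))    # e.g. normals, curvature
segment_feats = rng.normal(size=(n, 3))  # e.g. segment height, extent
object_feats = rng.normal(size=(n, 3))   # e.g. bounding-box shape
context_feats = rng.normal(size=(n, 2))  # e.g. distance to road surface
X = np.hstack([point_feats, segment_feats, object_feats, context_feats])
y = rng.integers(0, 5, size=n)           # toy labels for 5 classes

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf.fit(Xtr, ytr)
print("toy accuracy:", clf.score(Xte, yte))
```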

  9. Research on common methods for evaluating the operation effect of integrated wastewater treatment facilities of iron and steel enterprises

    Science.gov (United States)

    Bingsheng, Xu

    2017-04-01

    Considering the large quantities of wastewater generated by iron and steel enterprises in China, this paper investigates common methods for evaluating the integrated wastewater treatment effect of iron and steel enterprises. Based on survey results on environmental protection performance, technological economy, resource and energy consumption, and services and management, an indicator system for evaluating the operation effect of integrated wastewater treatment facilities is set up. By discussing the standards and industrial policies in and outside China, 27 key secondary indicators are further defined on the basis of an investigation of the main equipment and key processes for wastewater treatment, so as to determine the method for setting the key quantitative and qualitative indicators of the evaluation indicator system. The system is also expected to satisfy the basic requirements of reasonable resource allocation, environmental protection and sustainable economic development, further improve the integrated wastewater treatment effect of iron and steel enterprises, and reduce the emission of hazardous substances and environmental impact.

  10. An integrated computer control system for the ANU linac

    Science.gov (United States)

    Davidson, P. M.; Foote, G. S.

    1996-02-01

    One facet of the installation of the superconducting linac at the ANU is the need for computer control of a variety of systems, such as beam transport, resonator RF, cryogenics and others. To accommodate this, a number of control interfaces (for example, analogue signals and RS232 serial lines) must be employed. Ideally, all of the systems should be able to be controlled from a central location, remote from the actual devices. To this end a system based around VAX computers and VME crates has been designed and is currently being developed and implemented. A VAXstation is used to issue control messages and perform high-level functions, while VME crates containing appropriate modules (primarily DACs, ADCs and digital I/O boards) control the devices. The controllers in the VME crates are AEON rtVAX modules running a real-time operating system. Communication with the VAXstation is via DECnet, on a private ethernet to allow communication rates unaffected by unrelated network activity and potentially increasing the security of the system by providing a possible network isolation point. Also on this ethernet are a number of terminal servers to control RS232 devices. A central database contains all device control and monitoring parameters. The main control process running on the VAXstation is responsible for maintaining the current values of the parameters in the database and for dispatching control messages to the appropriate VME crate or RS232 serial line. Separate graphical interface processes allow the operator to interact with the control process, communicating through shared memory. Many graphics processes can be active simultaneously, displaying either on a single or on multiple terminals. Software running on the rtVAX controllers handles the low-level device-specific control by translating messages from the main control process to VME commands which set hardware outputs on VME modules. Similarly, requests for the value of a parameter result in the rtVAX program

  11. Consequence analysis in LPG installation using an integrated computer package.

    Science.gov (United States)

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code, Atlantide, developed to assess the consequences associated with accidental events that can occur in a LPG storage plant. Atlantide is designed to be simple while remaining adequate for consequence analysis as required by the Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events, from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of a LPG installation. The limited input data required, and the automatic linking between the individual models, which are activated in a defined sequence depending on the selected accidental event, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from the public literature or from in-house developed software and tailored to be easy to use and fast to run while nevertheless providing realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG
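
    The simplified Event Trees mentioned above can be evaluated mechanically once branch probabilities are assigned; a bare-bones sketch with invented numbers (not Atlantide's models or data):

```python
def expand(event, prob, branches, leaves):
    """Walk the event tree, accumulating probability at the leaves."""
    if event not in branches:
        leaves[event] = leaves.get(event, 0.0) + prob
        return
    for nxt, p in branches[event]:
        expand(nxt, prob * p, branches, leaves)

branches = {
    "release": [("immediate ignition", 0.1),
                ("no immediate ignition", 0.9)],
    "immediate ignition": [("jet fire", 1.0)],
    "no immediate ignition": [("delayed ignition", 0.3),
                              ("safe dispersion", 0.7)],
    "delayed ignition": [("flash fire", 0.6), ("VCE", 0.4)],
}
leaves = {}
expand("release", 1.0, branches, leaves)
for outcome, p in sorted(leaves.items(), key=lambda kv: -kv[1]):
    print(f"{outcome}: {p:.3f}")
```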

  12. Intraoperative computed tomography with integrated navigation system in spinal stabilizations.

    Science.gov (United States)

    Zausinger, Stefan; Scheder, Ben; Uhl, Eberhard; Heigl, Thomas; Morhard, Dominik; Tonn, Joerg-Christian

    2009-12-15

    STUDY DESIGN: A prospective interventional case-series study plus a retrospective analysis of historical patients for comparison of data. OBJECTIVE: To evaluate workflow, feasibility, and clinical outcome of navigated stabilization procedures with data acquisition by intraoperative computed tomography. SUMMARY OF BACKGROUND DATA: Routine fluoroscopy to assess pedicle screw placement is not consistently reliable. Our hypothesis was that image-guided spinal navigation using an intraoperative CT scanner can improve the safety and precision of spinal stabilization surgery. METHODS: CT data of 94 patients (thoracolumbar [n = 66], C1/2 [n = 12], cervicothoracic instability [n = 16]) were acquired after positioning the patient in the final surgical position. A sliding-gantry 40-slice CT was used for image acquisition. Data were imported to a frameless infrared-based neuronavigation workstation. Intraoperative CT was obtained to assess the accuracy of instrumentation and, if necessary, the extent of decompression. All patients were clinically evaluated by Odom criteria after surgery and after 3 months. RESULTS: Computed accuracy of the navigation system reached ≤2 mm. Screw deviations of ≥2 mm, without persistent neurologic or vascular damage, were found in 20/414 screws (4.8%), leading to immediate correction of 10 screws (2.4%). Control-iCT changed the course of surgery in 8 cases (8.5% of all patients). The overall revision rate was 8.5% (4 wound revisions, 2 CSF fistulas, and 2 epidural hematomas). There was no reoperation due to implant malposition. According to Odom criteria all patients experienced a clinical improvement. A retrospective analysis of 182 patients with navigated thoracolumbar transpedicular stabilizations in the pre-iCT era revealed an overall revision rate of 10.4%, with 4.4% of patients requiring screw revision. CONCLUSION: Intraoperative CT in combination with neuronavigation provides high accuracy of screw placement and thus safety for patients undergoing spinal stabilization

  13. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and in the verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. An assessment of the applicability of RELAP5/mod3 to accident analysis in integral reactors is presented.

  14. The Computer Aided Integrated Design and Analysis of Oil Tank in Vehicle

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper creates a 3D solid model and assembly of the URJ 92-6 oil tank and analyses its strength with the integrated CAD/CAE/CAM software I-DEAS. Through integrated simulation on the computer, the design efficiency and quality of the oil tank are greatly improved. By adopting integrated CAD/CAE/CAM software for integrated research on equipment and products, overall analyses covering 3-D solid modeling, pre-assembly, strength and other aspects can be carried out, realizing paperless and concurrent design...

  15. Second Generation Integrated Composite Analyzer (ICAN) Computer Code

    Science.gov (United States)

    Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.

    1993-01-01

    This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.

  16. Hospital Palliative Care Teams and Post-Acute Care in Nursing Facilities: An Integrative Review.

    Science.gov (United States)

    Carpenter, Joan G

    2017-01-01

    Although palliative care consultation teams are common in U.S. hospitals, follow up and outcomes of consultations for frail older adults discharged to nursing facilities are unclear. To summarize and critique research on the care of patients discharged to nursing facilities following a hospital-based palliative care consult, a systematic search of PubMed, CINAHL, Ageline, and PsycINFO was conducted in February 2016. Data from the articles (N = 12) were abstracted and analyzed. The results of 12 articles reflecting research conducted in five countries are presented in narrative form. Two studies focused on nurse perceptions only, three described patient/family/caregiver experiences and needs, and seven described patient-focused outcomes. Collectively, these articles demonstrate that disruption in palliative care service on hospital discharge and nursing facility admission may result in high symptom burden, poor communication, and inadequate coordination of care. High mortality was also noted. [Res Gerontol Nurs. 2017; 10(1):25-34.].

  17. Undulator beamline optimization with integrated chicanes for X-ray free-electron-laser facilities.

    Science.gov (United States)

    Prat, Eduard; Calvi, Marco; Ganter, Romain; Reiche, Sven; Schietinger, Thomas; Schmidt, Thomas

    2016-07-01

    An optimization of the undulator layout of X-ray free-electron-laser (FEL) facilities based on placing small chicanes between the undulator modules is presented. The installation of magnetic chicanes offers the following benefits with respect to state-of-the-art FEL facilities: reduction of the required undulator length to achieve FEL saturation, improvement of the longitudinal coherence of the FEL pulses, and the ability to produce shorter FEL pulses with higher power levels. Numerical simulations performed for the soft X-ray beamline of the SwissFEL facility show that optimizing the advantages of the layout requires shorter undulator modules than the standard ones. This proposal allows a very compact undulator beamline that produces fully coherent FEL pulses and it makes possible new kinds of experiments that require very short and high-power FEL pulses.

  18. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  19. Safeguards-by-Design: Early Integration of Physical Protection and Safeguardability into Design of Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    T. Bjornard; R. Bean; S. DeMuth; P. Durst; M. Ehinger; M. Golay; D. Hebditch; J. Hockert; J. Morgan

    2009-09-01

    The application of a Safeguards-by-Design (SBD) process for new nuclear facilities has the potential to minimize proliferation and security risks as the use of nuclear energy expands worldwide. This paper defines a generic SBD process and its incorporation from early design phases into existing design / construction processes and develops a framework that can guide its institutionalization. SBD could be a basis for a new international norm and standard process for nuclear facility design. This work is part of the U.S. DOE’s Next Generation Safeguards Initiative (NGSI), and is jointly sponsored by the Offices of Non-proliferation and Nuclear Energy.

  20. Critical Issues Forum: A multidisciplinary educational program integrating computer technology

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, R.J.; Robertson, B.; Jacobs, D. [Los Alamos National Lab., NM (United States)

    1998-09-01

    The Critical Issues Forum (CIF) funded by the US Department of Energy is a collaborative effort between the Science Education Team of Los Alamos National Laboratory (LANL) and New Mexico high schools to improve science education throughout the state of New Mexico as well as nationally. By creating an education relationship between the LANL with its unique scientific resources and New Mexico high schools, students and teachers participate in programs that increase not only their science content knowledge but also their critical thinking and problem-solving skills. The CIF program focuses on current, globally oriented topics crucial to the security of not only the US but to that of all nations. The CIF is an academic-year program that involves both teachers and students in the process of seeking solutions for real world concerns. Built around issues tied to LANL's mission, participating students and teachers are asked to critically investigate and examine the interactions among the political, social, economic, and scientific domains while considering diversity issues that include geopolitical entities and cultural and ethnic groupings. Participants are expected to collaborate through telecommunications during the research phase and participate in a culminating multimedia activity, where they produce and deliver recommendations for the current issues being studied. The CIF was evaluated and found to be an effective approach for teacher professional training, especially in the development of skills for critical thinking and questioning. The CIF contributed to students' ability to integrate diverse disciplinary content about science-related topics and supported teachers in facilitating the understanding of their students using the CIF approach. Networking technology in CIF has been used as an information repository, resource delivery mechanism, and communication medium.

  1. Performance analysis of three dimensional integral equation computations on a massively parallel computer. M.S. Thesis

    Science.gov (United States)

    Logan, Terry G.

    1994-01-01

    The purpose of this study is to investigate the performance of integral equation computations using the numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and the conventional Cray-YMP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with those on the Cray-YMP with a single processor. The comparison of the performance indicates that the parallel CM-FORTRAN code nearly matches or outperforms the equivalent serial FORTRAN code for some cases.
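
    The usual figures of merit for such a comparison are the speedup S = T_serial / T_parallel and the parallel efficiency E = S / p; a minimal illustration with invented timings:

```python
# Speedup and efficiency from wall-clock timings (values invented).
timings = {1: 420.0, 32: 18.5, 64: 10.2, 128: 6.1}  # seconds
t1 = timings[1]
for p, tp in sorted(timings.items()):
    if p == 1:
        continue
    s = t1 / tp
    print(f"p={p:4d}  speedup={s:6.1f}  efficiency={s / p:.2f}")
```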

  2. Scheduling of a computer integrated manufacturing system: A simulation study

    Directory of Open Access Journals (Sweden)

    Nadia Bhuiyan

    2011-12-01

    Full Text Available Purpose: The purpose of this paper is to study the effect of selected scheduling dispatching rules on the performance of an actual CIM system using different performance measures, and to compare the results with the literature. Design/methodology/approach: To achieve this objective, a computer simulation model of the existing CIM system is developed to test the performance of different scheduling rules with respect to mean flow time, machine efficiency and total run time as performance measures. Findings: Results suggest that the system performs much better with respect to machine efficiency when the initial number of parts released is at its maximum and the buffer size is at its minimum. Furthermore, considering the average flow time, the system performs much better when the selected dispatching rule is either Earliest Due Date (EDD) or Shortest Process Time (SPT), with a buffer size of five and an initial number of released parts of eight. Research limitations/implications: In this research, a limited number of factors and levels were considered in the experimental set-up; however, the flexibility of the model allows experimenting with additional factors and levels. In the simulation experiments of this research, three scheduling dispatching rules (First In/First Out (FIFO), EDD, and SPT) were used. In future research, the effect of other dispatching rules on system performance can be compared, and some assumptions can be relaxed. Practical implications: This research helps to identify the potential effect of a selected number of dispatching rules and two other factors, the number of buffers and the initial number of parts released, on the performance of existing CIM systems with different part types where the machines are the major resource constraints. Originality/value: This research is among the few to study the effect of dispatching rules on the performance of CIM systems with the use of terminating simulation analysis. This is
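
    On a single machine the three dispatching rules reduce to different priority keys over the pending queue; a compact simulation sketch with invented job data (the paper's CIM model is far richer), reporting mean flow time:

```python
import random

random.seed(42)
jobs = [{"arrival": i * 2.0,
         "proc": random.uniform(1, 8),
         "due": i * 2.0 + random.uniform(5, 20)} for i in range(50)]

def simulate(rule_key):
    """Single-machine dispatching; returns the mean flow time."""
    t, pending, flows = 0.0, [], []
    todo = sorted(jobs, key=lambda j: j["arrival"])
    while pending or todo:
        while todo and todo[0]["arrival"] <= t:
            pending.append(todo.pop(0))
        if not pending:                    # idle until the next arrival
            t = todo[0]["arrival"]
            continue
        job = min(pending, key=rule_key)   # apply the dispatching rule
        pending.remove(job)
        t += job["proc"]
        flows.append(t - job["arrival"])
    return sum(flows) / len(flows)

for name, key in [("FIFO", lambda j: j["arrival"]),
                  ("EDD", lambda j: j["due"]),
                  ("SPT", lambda j: j["proc"])]:
    print(name, round(simulate(key), 2))
```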

  3. High speed image space parallel processing for computer-generated integral imaging system.

    Science.gov (United States)

    Kwon, Ki-Chul; Park, Chan; Erdenebat, Munkh-Uchral; Jeong, Ji-Seong; Choi, Jeong-Hun; Kim, Nam; Park, Jae-Hyeung; Lim, Young-Tae; Yoo, Kwan-Hee

    2012-01-16

    In an integral imaging display, the computer-generated integral imaging method has been widely used to create the elemental images from given three-dimensional object data. Long processing time, however, has been problematic, especially when the three-dimensional object data set or the number of elemental lenses is large. In this paper, we propose an image space parallel processing method, implemented using the Open Computing Language (OpenCL), for rapid generation of the elemental image sets from large three-dimensional volume data. Using the proposed technique, it is possible to realize a real-time interactive integral imaging display system for 3D volume data constructed from computed tomography (CT) or magnetic resonance imaging (MRI) data.
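
    The image-space parallelism rests on the fact that each elemental image depends only on its own lens position, so the lens grid can be mapped across workers. The sketch below uses Python multiprocessing and a toy pinhole projection in place of the paper's OpenCL kernels; all geometry parameters are invented.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

import numpy as np

# A toy point cloud centred around z = 5 in front of the lens array.
POINTS = np.random.default_rng(0).uniform(-1, 1, size=(5000, 3)) + [0, 0, 5]
RES, PITCH, GAP = 32, 0.2, 0.1  # pixels/lens, lens pitch, lens-display gap

def elemental_image(lens_xy):
    """Render one elemental image through a pinhole at lens_xy."""
    lx, ly = lens_xy
    img = np.zeros((RES, RES))
    for x, y, z in POINTS:
        u = lx + GAP * (lx - x) / z   # intersection with display plane
        v = ly + GAP * (ly - y) / z
        i = int((u - lx) / PITCH * RES + RES / 2)
        j = int((v - ly) / PITCH * RES + RES / 2)
        if 0 <= i < RES and 0 <= j < RES:
            img[j, i] += 1.0
    return img

if __name__ == "__main__":
    lenses = [(i * PITCH, j * PITCH)
              for i, j in product(range(8), range(8))]
    with ProcessPoolExecutor() as pool:   # one task per elemental image
        images = list(pool.map(elemental_image, lenses))
    print(len(images), "elemental images of shape", images[0].shape)
```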

  4. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2008-01-01

    levels in CFD-based flowpath modeling of the facility. The analysis tools used here expand on the multi-element unstructured CFD which has been tailored and validated for impingement dynamics of dry plumes, complex valve/feed systems, and high-pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation, such as (a) the importance of modeling the facility with a real-gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) the existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) the inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and the expansion of the second-stage steam ejectors. The procedure used for modeling the facility was as follows: (i) the engine, test cell and first-stage ejectors were simulated with an axisymmetric approximation; (ii) the turning duct, second-stage ejectors and the piping downstream of the second-stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation. The solution (i.e., primitive variables such as pressure, velocity components, temperature and turbulence quantities) was passed from the first computational domain and specified as a supersonic boundary condition for the second simulation. (iii) The third domain comprised the exit diffuser and the region in the vicinity of the facility (primarily included to capture the correct shock structure at the exit of the facility and the entrainment characteristics). The first set of simulations, comprising the engine, test cell and first-stage ejectors, was carried out both as a turbulent real-gas calculation and as a turbulent perfect-gas calculation. A comparison for the two cases (real-gas turbulent and perfect-gas turbulent) of the Ma

  5. The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

    Science.gov (United States)

    Prinn, R. G.

    2013-12-01

    The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that
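
    The ensemble construction can be illustrated with SciPy's Latin hypercube sampler driving a stand-in model; the two uncertain parameters and the toy physics below are invented, not the IGSM's.

```python
import numpy as np
from scipy.stats import qmc

# 400-member Latin hypercube design over two uncertain inputs:
# a climate-sensitivity-like parameter and an emissions growth rate.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=400)
draws = qmc.scale(unit, [1.5, 0.0], [6.0, 0.03])

def toy_model(sens, growth, years=100):
    forcing = np.log1p(growth) * years    # stand-in for real physics
    return sens * forcing / np.log(2.0)   # warming proxy, K

warming = np.array([toy_model(s, g) for s, g in draws])
print("5th/50th/95th percentiles:",
      np.percentile(warming, [5, 50, 95]).round(2))
```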

  6. 78 FR 7334 - Port Authority Access to Facility Vulnerability Assessments and the Integration of Security Systems

    Science.gov (United States)

    2013-02-01

    ... not have access to the Internet, you may view the docket online by visiting the Docket Management..., emergency preparedness and response, and communications capabilities (33 CFR 105.305). Facility Security... about the costs associated with these approaches as well as any potential benefit. These comments...

  7. DOE standard: Integration of environment, safety, and health into facility disposition activities. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    This volume contains the appendices that provide additional environment, safety, and health (ES and H) information to complement Volume 1 of this Standard. Appendix A provides a set of candidate DOE ES and H directives and external regulations, organized by hazard types, that may be used to identify potentially applicable directives for a specific facility disposition activity. Appendix B offers examples and lessons learned that illustrate implementation of ES and H approaches discussed in Section 3 of Volume 1. Appendix C contains ISMS performance expectations to guide a project team in developing and implementing an effective ISMS and in developing specific performance criteria for use in facility disposition. Appendix D provides guidance for identifying potential Applicable or Relevant and Appropriate Requirements (ARARs) when decommissioning facilities fall under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) process. Appendix E discusses ES and H considerations for dispositioning facilities by privatization. Appendix F is an overview of the WSS process. Appendix G provides a copy of two DOE Office of Nuclear Safety Policy and Standards memoranda that form the bases for some of the guidance discussed within the Standard. Appendix H gives information on available hazard analysis techniques and references. Appendix I provides a supplemental discussion to Sections 3.3.4, Hazard Baseline Documentation, and 3.3.6, Environmental Permits. Appendix J presents a sample readiness evaluation checklist.

  8. Injection and extraction computer control system of HIRFL-SSC (Heavy Ion Research Facility of Lanzhou - Separated Sector Cyclotron)

    CERN Document Server

    Zhang Wei; Chen Yun; Zhang Xia; Hu Jian Jun; Xu Xing Ming

    2002-01-01

    The injection and extraction computer control system of HIRFL-SSC (Heavy Ion Research Facility of Lanzhou - Separated Sector Cyclotron) is introduced. The software is described briefly and the hardware structure is presented in detail. The control system allows the injection and extraction elements to be adjusted from a PC through a Windows-style operator interface, making the adjustment convenient and accurate.

  9. Integrated O&M for energy generation and exchange facilities; O&M integral para instalaciones de generación e intercambio de energía

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-07-01

    Ingeteam Service, part of the Ingeteam Group, is a leading company in the provision of integrated O&M services at energy generation and exchange facilities worldwide. From its head office in the Albacete Science and Technology Park, it manages the work of the 1,300 employees that make up its global workforce, rendering services to wind farms, PV installations and power generation plants. In addition, it maintains an active participation strategy in a range of R&D+i programmes that improve the existing technologies and are geared towards new production systems and new diagnostic techniques, applied to renewables installation maintenance. (Author)

  10. A computer-controlled experimental facility for krypton and xenon adsorption coefficient measurements on activated carbons

    Energy Technology Data Exchange (ETDEWEB)

    Del Serra, Daniele; Aquaro, Donato; Mazed, Dahmane; Pazzagli, Fabio; Ciolini, Riccardo, E-mail: r.ciolini@ing.unipi.it

    2015-07-15

    Highlights: • An experimental test facility for qualification of the krypton and xenon adsorption properties of activated carbons. • Measurement of the adsorption coefficient by the elution curve method. • Simultaneous on-line control of the main physical parameters influencing the adsorption properties of activated carbon. - Abstract: An automated experimental test facility, intended specifically for qualification of the krypton and xenon adsorption properties of activated carbon samples, was designed and constructed. The experimental apparatus was designed to allow on-line control of the main physical parameters that greatly influence the adsorption properties of activated carbon. The measurement of the adsorption coefficient, based upon the elution curve method, can be performed with a precision better than 5% at gas pressures ranging from atmospheric pressure up to 9 bar and bed temperatures from 0 up to 80 °C. The carrier gas flow rate can be varied from 40 up to 4000 N cm³ min⁻¹, allowing measurement of the dynamic adsorption coefficient with face velocities from 0.3 up to 923 cm min⁻¹, depending on the gas pressure and the test cell being used. The moisture content of the activated carbon can be precisely controlled during measurement through the relative humidity of the carrier gas.
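
    The elution curve method mentioned above is commonly implemented by taking the first moment of the elution curve as the mean residence time. The sketch below illustrates that estimator under those usual conventions; the symbols, units and sample data are illustrative and not taken from the paper.

        import numpy as np

        def dynamic_adsorption_coefficient(t, c, flow_rate, carbon_mass, t_dead=0.0):
            """Estimate k (cm^3 of gas per g of carbon) from an elution curve.

            t           : time samples (min)
            c           : detector response, proportional to tracer concentration
            flow_rate   : carrier gas flow rate (cm^3/min)
            carbon_mass : activated-carbon bed mass (g)
            t_dead      : dead time of the empty system (min), subtracted out

            Uses the first moment of the curve as the mean residence time,
            k = F * (t_mean - t_dead) / m, the usual elution-curve estimator.
            """
            t_mean = np.trapz(t * c, t) / np.trapz(c, t)   # first moment
            return flow_rate * (t_mean - t_dead) / carbon_mass

        # Illustrative pulse: Gaussian elution centred at 30 min
        t = np.linspace(0, 120, 2000)
        c = np.exp(-0.5 * ((t - 30.0) / 4.0) ** 2)
        print(dynamic_adsorption_coefficient(t, c, flow_rate=100.0,
                                             carbon_mass=5.0, t_dead=1.5))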

  11. AVES: A high performance computer cluster array for the INTEGRAL satellite scientific data analysis

    Science.gov (United States)

    Federici, Memmo; Martino, Bruno Luigi; Ubertini, Pietro

    2012-07-01

    In this paper we describe a new computing system array, designed, built and now used at the Space Astrophysics and Planetary Institute (IAPS) in Rome, Italy, for the INTEGRAL Space Observatory scientific data analysis. This new system has become necessary in order to reduce the processing time of the INTEGRAL data accumulated during more than 9 years of in-orbit operation. In order to fulfill the scientific data analysis requirements with a moderately limited investment, the starting approach has been to use a 'cluster' array of commercial quad-CPU computers, with the extremely large scientific and calibration data archive kept on line.

  12. Computer algebra in quantum field theory integration, summation and special functions

    CERN Document Server

    Schneider, Carsten

    2013-01-01

    The book focuses on advanced computer algebra methods and special functions that have striking applications in the context of quantum field theory. It presents the state of the art and new methods for (infinite) multiple sums, multiple integrals, in particular Feynman integrals, difference and differential equations in the format of survey articles. The presented techniques emerge from interdisciplinary fields: mathematics, computer science and theoretical physics; the articles are written by mathematicians and physicists with the goal that both groups can learn from the other field, including

  13. AI/OR computational model for integrating qualitative and quantitative design methods

    Science.gov (United States)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  15. Laboratory Testing of Bulk Vitrified Low-Activity Waste Forms to Support the 2005 Integrated Disposal Facility Performance Assessment Erratum

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Gary L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-06

    This report refers to or contains Kg values for glasses LAWA44, LAWB45 and LAWC22 affected by calculation errors as identified by Papathanassiu et al. (2011). The corrected Kg values are reported in an erratum included in the revised version of the original report. The revised report can be referenced as follows: Pierce E. M. et al. (2004) Waste Form Release Data Package for the 2005 Integrated Disposal Facility Performance Assessment. PNNL-14805 Rev. 0 Erratum. Pacific Northwest National Laboratory, Richland, WA, USA.

  17. The Use of Public Computing Facilities by Library Patrons: Demography, Motivations, and Barriers

    Science.gov (United States)

    DeMaagd, Kurt; Chew, Han Ei; Huang, Guanxiong; Khan, M. Laeeq; Sreenivasan, Akshaya; LaRose, Robert

    2013-01-01

    Public libraries play an important part in the development of a community. Today, they are seen as more than storehouses of books; they are also responsible for the dissemination of online and offline information. Public access computers are becoming increasingly popular as more and more people understand the need for internet access. Using a…

  18. Computer Simulation of an Anesthesia Service at a U.S. Army Medical Treatment Facility

    Science.gov (United States)

    1999-08-01

    Computer Simulation of an Anesthesia Service at a U.S. Army Medical Treatment...bettering marketing efforts). There are several articles that address staffing from the perspective of what type of provider is the most cost

  19. It Takes Glue to Tango: MeDICi integration framework creates data-intensive computing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian; Oehmen, Christopher S.; McDermott, Jason E.

    2008-11-01

    Biologists increasingly rely on high-performance computing (HPC) platforms to rapidly process the tsunami of data generated by high-throughput genome and metagenome sequencing technology and high-throughput proteomics. Unfortunately, the platforms that produce the massive data sets rarely work smoothly with the interactive analysis and visualization programs used in bioinformatics. This makes it difficult for researchers to exploit the computational power of HPC platforms to speed scientific discovery. At the Department of Energy's Pacific Northwest National Laboratory in Richland, Wash., researchers are creating computing environments for biologists that seamlessly integrate collections of data and computational resources. These advantages enable users to rapidly analyze high-throughput data. A major goal is to shield the biologist from the complexity of interacting with multiple dissimilar databases and running tasks on HPC platforms and computational clusters. One of those environments, the MeDICi Integration Framework, is now available for free download. Short for Middleware for Data-Intensive Computing, MeDICi makes it easy to integrate separate codes into complex applications that operate as a data analysis pipeline.

  20. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
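
    The recursive equations referred to above are the standard row-sum/integral-image recurrences. The sketch below shows them in plain serial form, together with the constant-time box sum that makes the representation attractive for SURF-like detectors; the hardware row-parallel decomposition itself is not reproduced here.

        import numpy as np

        def integral_image(img):
            """Serial form of the standard recurrences the paper builds on:
            s(x, y)  = s(x, y-1)  + i(x, y)    (row sum)
            ii(x, y) = ii(x-1, y) + s(x, y)    (integral image)
            """
            ii = np.zeros_like(img, dtype=np.int64)
            s = np.zeros_like(img, dtype=np.int64)
            for x in range(img.shape[0]):
                for y in range(img.shape[1]):
                    s[x, y] = (s[x, y - 1] if y > 0 else 0) + img[x, y]
                    ii[x, y] = (ii[x - 1, y] if x > 0 else 0) + s[x, y]
            return ii

        def box_sum(ii, top, left, bottom, right):
            """Sum over any rectangle from four lookups: constant time,
            independent of filter size (the property SURF exploits)."""
            total = ii[bottom, right]
            if top > 0:
                total -= ii[top - 1, right]
            if left > 0:
                total -= ii[bottom, left - 1]
            if top > 0 and left > 0:
                total += ii[top - 1, left - 1]
            return total

        img = np.arange(16).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 3, 3) == img[1:4, 1:4].sum()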

  1. A specialized ODE integrator for the efficient computation of parameter sensitivities

    Directory of Open Access Journals (Sweden)

    Gonnet Pedro

    2012-05-01

    Background: Dynamic mathematical models in the form of systems of ordinary differential equations (ODEs) play an important role in systems biology. For any sufficiently complex model, the speed and accuracy of solving the ODEs by numerical integration are critical. This applies especially to systems identification problems where the parameter sensitivities must be integrated alongside the system variables. Although several very good general-purpose ODE solvers exist, few of them compute the parameter sensitivities automatically. Results: We present a novel integration algorithm that is based on second derivatives and contains other unique features such as improved error estimates. These features allow the integrator to take larger time steps than other methods. In practical applications, i.e. systems biology models of different sizes and behaviors, the method competes well with established integrators in solving the system equations, and it outperforms them significantly when local parameter sensitivities are evaluated. For ease of use, the solver is embedded in a framework that automatically generates the integrator input from an SBML description of the system of interest. Conclusions: For future applications, comparatively 'cheap' parameter sensitivities will enable advances in solving large, otherwise computationally expensive parameter estimation and optimization problems. More generally, we argue that substantially better computational performance can be achieved by exploiting characteristics specific to the problem domain; elements of our methods such as the error estimation could find broader use in other, more general numerical algorithms.
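
    The idea of integrating parameter sensitivities alongside the system variables can be illustrated with a toy one-parameter model. The sketch below uses a general-purpose SciPy solver on the augmented (state plus sensitivity) system; it is not the paper's second-derivative integrator, just the forward-sensitivity formulation whose cost such specialized solvers reduce.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy model: dy/dt = -k*y. The local sensitivity S = dy/dk obeys the
        # forward sensitivity equation dS/dt = (df/dy)*S + df/dk = -k*S - y,
        # integrated alongside the state.
        def augmented_rhs(t, z, k):
            y, S = z
            return [-k * y, -k * S - y]

        k = 0.7
        sol = solve_ivp(augmented_rhs, (0.0, 5.0), [1.0, 0.0], args=(k,),
                        rtol=1e-8, atol=1e-10)
        y_end, S_end = sol.y[:, -1]

        # Check against the analytic sensitivity dy/dk = -t*exp(-k*t) at t = 5
        print(S_end, -5.0 * np.exp(-k * 5.0))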

  2. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  4. Royal Military College of Canada SLOWPOKE-2 facility. Integrated regulating and instrumentation system (SIRCIS) upgrade project

    Energy Technology Data Exchange (ETDEWEB)

    Corcoran, W.P.; Nielsen, K.S.; Kelly, D.G.; Weir, R.D. [Royal Military College of Canada (RMCC), Kingston, Ontario (Canada)

    2013-07-01

    The SLOWPOKE-2 Facility at the Royal Military College of Canada has operated the only digitally controlled SLOWPOKE reactor since 2001 (Version 1.0). The present work describes ongoing project development to provide a robust digital reactor control system that is consistent with Aging Management as summarized in the Facility's Life Cycle Management and Maintenance Plan. The project has transitioned from a post-graduate research activity to a comprehensively managed project supported by a team of RMCC professional and technical staff who have delivered an update of the V1.1 system software and hardware implementation that is consistent with best Canadian nuclear industry practice. The challenges associated with the implementation of Version 2.0 in February 2012, the lessons learned from this implementation, and the applications of these lessons to a redesign and rewrite of the RMCC SLOWPOKE-2 digital instrumentation and regulating system (Version 3) are discussed. (author)

  5. Energy Systems Integration Facility (ESIF) External Stakeholders Workshop: Workshop Proceedings, 9 October 2008, Golden, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    Komomua, C.; Kroposki, B.; Mooney, D.; Stoffel, T.; Parsons, B.; Hammond, S.; Kutscher, C.; Remick, R.; Sverdrup, G.; Hawsey, R.; Pacheco, M.

    2009-01-01

    On October 9, 2008, NREL hosted a workshop to provide an opportunity for external stakeholders to offer insights and recommendations on the design and functionality of DOE's planned Energy Systems Integration Facility (ESIF). The goal was to ensure that the planning for the ESIF effectively addresses the most critical barriers to large-scale energy efficiency (EE) and renewable energy (RE) deployment. This technical report documents the ESIF workshop proceedings.

  6. Time-integrated measurements of fusion-produced protons emitted from PF-facilities

    Science.gov (United States)

    Malinowska, A.; Szydlowski, A.; Zebrowski, J.; Sadowski, M. J.; Scholz, M.; Schmidt, H.; Karpinski, P.; Jaskola, M.; Korman, A.

    2006-01-01

    The paper reports on measurements of fusion-reaction protons emitted from high-current Plasma Focus discharges. The experiments were carried out on two Plasma Focus facilities (PF-360 and PF-1000), and the results obtained are compared in the paper. The paper presents detailed maps of the fusion proton fluxes, which were recorded with pinhole cameras. These maps show the distributions and shapes of the fast-proton sources within the pinch plasma column.

  7. Cambridge-Cranfield High Performance Computing Facility (HPCF) purchases ten Sun Fire(TM) 15K servers to dramatically increase power of eScience research

    CERN Multimedia

    2002-01-01

    "The Cambridge-Cranfield High Performance Computing Facility (HPCF), a collaborative environment for data and numerical intensive computing privately run by the University of Cambridge and Cranfield University, has purchased 10 Sun Fire(TM) 15K servers from Sun Microsystems, Inc.. The total investment, which includes more than $40 million in Sun technology, will dramatically increase the computing power, reliability, availability and scalability of the HPCF" (1 page).

  8. Developing Mobile- and BIM-Based Integrated Visual Facility Maintenance Management System

    Directory of Open Access Journals (Sweden)

    Yu-Cheng Lin

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the performance of 2D graphics when depicting maintenance service. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology: a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool.

  9. Developing mobile- and BIM-based integrated visual facility maintenance management system.

    Science.gov (United States)

    Lin, Yu-Cheng; Su, Yu-Chih

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the performance of 2D graphics when depicting maintenance service. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology: a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool.

  10. MSE-THERMO: Integrated computer system for application of chemical thermodynamics in materials science and engineering

    Energy Technology Data Exchange (ETDEWEB)

    Leitner, J.; Chuchvalec, P.; Vonka, P. [Inst. of Chemical Technology, Prague (Czech Republic)

    1995-08-01

    MSE-THERMO is an integrated computer system combining thermochemical databases with sophisticated computational software for diverse thermodynamic calculations. It consists of the database MSE-DATA, where thermodynamic data for pure substances are stored, as well as programs for the calculation of thermodynamic functions of pure substances, changes of thermodynamic functions for chemical reactions, ternary phase diagrams in a subsolidus region, phase stability diagrams, and the equilibrium composition of multicomponent and multiphase systems. The data files as well as the computational software tools are at present being intensively extended.

  11. An evolving infrastructure for scientific computing and the integration of new graphics technology

    Energy Technology Data Exchange (ETDEWEB)

    Fong, K.W.

    1993-02-01

    The National Energy Research Supercomputer Center (NERSC) at the Lawrence Livermore National Laboratory is currently pursuing several projects to implement and integrate new hardware and software technologies. While each of these projects ought to be and is in fact individually justifiable, there is an appealing metaphor for viewing them collectively which provides a simple and memorable way to understand the future direction not only of supercomputing services but of computer centers in general. Once this general direction is understood, it becomes clearer what future computer graphics technologies would be possible and desirable, at least within the context of large scale scientific computing.

  12. Facile implementation of integrated tempering sampling method to enhance the sampling over a broad range of temperatures

    CERN Document Server

    Zhao, Peng; Gao, Yi Qin; Lu, Zhong-Yuan

    2013-01-01

    The integrated tempering sampling (ITS) method is an approach to enhance the sampling over a broad range of energies and temperatures in computer simulations. In this paper, a new version of the integrated tempering sampling method is proposed. In the new approach presented here, we obtain parameters such as the set of temperatures and the corresponding weighting factors from canonical averages of the potential energy. These parameters can be easily obtained without estimating partition functions. We apply this new approach to the Lennard-Jones fluid, the ALA-PRO peptide and single polymer chain systems to validate and benchmark the method.
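
    The generalized ensemble underlying ITS can be summarized in a few lines. The sketch below evaluates the standard ITS effective potential and the force-rescaling factor for a given set of temperatures and weighting factors; how the weights n_k are obtained from canonical averages (the paper's contribution) is not reproduced, and all numbers are illustrative.

        import numpy as np

        def its_effective_potential(U, betas, n, beta0):
            """U_eff(U) = -(1/beta0) * ln sum_k n_k exp(-beta_k U),
            evaluated with a log-sum-exp shift for numerical stability."""
            a = np.log(n) - betas * U
            amax = a.max()
            return -(amax + np.log(np.exp(a - amax).sum())) / beta0

        def its_force_factor(U, betas, n, beta0):
            """dU_eff/dU: the factor that rescales the physical force in ITS."""
            a = np.log(n) - betas * U
            w = np.exp(a - a.max())
            return (betas * w).sum() / (beta0 * w.sum())

        kB = 0.0019872                           # kcal/(mol K), illustrative units
        temps = np.linspace(280.0, 380.0, 8)     # the broad temperature range
        betas = 1.0 / (kB * temps)
        n = np.ones_like(betas) / betas.size     # flat weights as a starting guess
        beta0 = 1.0 / (kB * 300.0)

        print(its_effective_potential(-120.0, betas, n, beta0))
        print(its_force_factor(-120.0, betas, n, beta0))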

  13. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    Science.gov (United States)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted energy savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
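
    The verification logic described above reduces to two comparisons. The sketch below illustrates it with invented monthly figures: first validate the model against meter readings, then compare predicted and realized savings; the data and the 8% savings figure are hypothetical, not ECPVER output.

        import numpy as np

        # Monthly energy use (MWh): simulated baseline vs. actual meter readings,
        # then actual readings after a conservation measure. Values are invented.
        simulated = np.array([310, 295, 280, 250, 230, 240])
        metered   = np.array([305, 301, 275, 256, 228, 246])
        after_ecm = np.array([281, 277, 252, 231, 210, 220])

        # Model validation: mean absolute percentage error against the meters
        mape = np.mean(np.abs(simulated - metered) / metered) * 100.0
        print(f"validation MAPE: {mape:.1f}%")

        # Verification: predicted vs. realized savings from the conservation measure
        predicted_savings = simulated.sum() * 0.08     # e.g. model predicts 8%
        actual_savings = metered.sum() - after_ecm.sum()
        print(predicted_savings, actual_savings)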

  14. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  15. Efficient coupling integrals computation of waveguide step discontinuities using BI-RME and Nystrom methods

    Science.gov (United States)

    Taroncher, Mariam; Vidal-Pantaleoni, Ana; Boria, Vicente E.; Marini, Stephan; Soto, Pablo; Cogollos, Santiago

    2004-04-01

    This paper describes a novel technique for the very efficient and accurate computation of the coupling integrals of waveguide step discontinuities between arbitrary cross-section waveguides. This new technique relies on solving the Integral Equation (IE) underlying the well-known Boundary Integral - Resonant Mode Expansion (BI-RME) method by the Nystrom approach, instead of using the traditional Galerkin version of the Method of Moments (MoM), thus providing large savings in computational cost. Comparative benchmarks between the results provided by the new technique and the original BI-RME method are successfully presented.
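
    Although the BI-RME coupling integrals themselves are beyond a short example, the Nystrom idea can be shown on a generic Fredholm equation of the second kind: replace the integral by a quadrature and collocate at the nodes, avoiding the double (testing) integrals of a Galerkin MoM. The kernel, right-hand side and parameters below are illustrative only.

        import numpy as np

        # Solve u(x) - lam * int_0^1 K(x, t) u(t) dt = f(x) by the Nystrom method:
        # replace the integral with an N-point quadrature and collocate at the
        # nodes, giving the dense linear system (I - lam * K * diag(w)) u = f.
        lam = 0.5
        K = lambda x, t: np.exp(-np.abs(x - t))
        f = lambda x: np.ones_like(x)

        N = 64
        # Gauss-Legendre nodes/weights mapped from [-1, 1] to [0, 1]
        x, w = np.polynomial.legendre.leggauss(N)
        x = 0.5 * (x + 1.0)
        w = 0.5 * w

        A = np.eye(N) - lam * K(x[:, None], x[None, :]) * w[None, :]
        u = np.linalg.solve(A, f(x))

        # Nystrom interpolation gives u anywhere, not just at the nodes
        def u_at(s):
            s = np.atleast_1d(s)
            return f(s) + lam * (K(s[:, None], x) * w * u).sum(axis=1)

        print(u_at(0.5))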

  16. MPL-A program for computations with iterated integrals on moduli spaces of curves of genus zero

    Science.gov (United States)

    Bogner, Christian

    2016-06-01

    We introduce the Maple program MPL for computations with multiple polylogarithms. The program is based on homotopy invariant iterated integrals on moduli spaces M0,n of curves of genus 0 with n ordered marked points. It includes the symbol map and procedures for the analytic computation of period integrals on M0,n. It supports the automated computation of a certain class of Feynman integrals.

  17. Insight into implementation of facility-based integrated management of childhood illness strategy in a rural district of Sindh, Pakistan

    Directory of Open Access Journals (Sweden)

    Nousheen Akber Pradhan

    2013-07-01

    Background: The integrated management of childhood illness (IMCI) strategy has been proven to improve health outcomes in children under 5 years of age. Pakistan, despite being in the late implementation phase of the strategy, continues to report high under-five mortality due to pneumonia, diarrhea, measles, and malnutrition – the main targets of the strategy. Objective: The study determines the factors influencing IMCI implementation at public-sector primary health care (PHC) facilities in Matiari district, Sindh, Pakistan. Design: An exploratory qualitative study with an embedded quantitative strand was conducted. The qualitative part included 16 in-depth interviews (IDIs) with stakeholders, comprising planners and policy makers at the provincial level (n=5), implementers and managers at the district level (n=3), and IMCI-trained physicians posted at PHC facilities (n=8). The quantitative part included a PHC facility survey (n=16) utilizing the WHO health facility assessment tool to assess the availability of IMCI essential drugs, supplies, and equipment. Qualitative content analysis was used to interpret the textual information, whereas descriptive frequencies were calculated for the health facility survey data. Results: The major factors reported to enhance IMCI implementation were knowledge and perception about the strategy and the need for a separate clinic for children aged under 5 years as potential support factors. The latter can facilitate strategy implementation through an allocated workforce and the required equipment and supplies. Constraint factors mainly included a lack of clear understanding of the strategy, poor planning for IMCI implementation, ambiguity in the defined roles and responsibilities among stakeholders, and insufficient essential supplies and drugs at PHC centers. The latter was further substantiated by the health facility survey findings, which indicated that none of the facilities had 100% stock of essential supplies and drugs. Only one out of all

  18. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds together with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  19. Architectonic integration of solar thermal facilities; Integracion arquitectonica de instalaciones de energia solar termica

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, G.; Martinez, J.

    2004-07-01

    This article proposes a procedure for achieving appropriate architectural integration of a solar thermal system on a building, taking into account the urban planning, building typology, installation and construction factors that take part in the process of integration. Sun exposure of the solar collectors is the main determinant, since it directly affects the design of the skin of the building. An appropriate analysis and resolution of the incorporation of the solar system at these four levels is proposed in order to ensure a correct integration of the system in the building. Starting from the urban planning determining factors, a rational process of integration based on the characteristics of the system, of the building and of their interrelation should guarantee an appropriate solution. (Author)

  20. Integrating Molecular Computation and Material Production in an Artificial Subcellular Matrix

    DEFF Research Database (Denmark)

    Fellermann, Harold; Hadorn, Maik; Bönzli, Eva

    Living systems are unique in that they integrate molecular recognition and information processing with material production on the molecular scale. The predominant locus of this integration is the cellular matrix, where a multitude of biochemical reactions proceed simultaneously in highly compartmentalized reaction compartments that interact and get delivered through vesicle trafficking. The European Commission funded project MatchIT (Matrix for Chemical IT) aims at creating an artificial cellular matrix that seamlessly integrates information processing and material production in much the same way as its biological counterpart: the project employs addressable chemical containers (chemtainers) interfaced with electronic computers via mechano-electronic microfluidics.

  1. INTEGRATION OF COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE PRODUCTION CERTIFICATION PROCEDURE AND FORMING OF SHIPPING DOCUMENTS

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2009-01-01

    Integration of informational computer technologies has made it possible to reorganize and optimize a number of processes by reducing document circulation, unifying documentation forms, and other measures.

  2. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices

    Science.gov (United States)

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun

    2016-01-01

    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  3. An Integrated Teaching Method of Gross Anatomy and Computed Tomography Radiology

    Science.gov (United States)

    Murakami, Tohru; Tajika, Yuki; Ueno, Hitoshi; Awata, Sachiko; Hirasawa, Satoshi; Sugimoto, Maki; Kominato, Yoshihiko; Tsushima, Yoshito; Endo, Keigo; Yorifuji, Hiroshi

    2014-01-01

    It is essential for medical students to learn and comprehend human anatomy in three dimensions (3D). With this in mind, a new system was designed in order to integrate anatomical dissections with diagnostic computed tomography (CT) radiology. Cadavers were scanned by CT scanners, and students then consulted the postmortem CT images during cadaver…

  4. Addressing the 21st Century Paradox: Integrating Entrepreneurship in the Computer Information Systems Curriculum

    Science.gov (United States)

    Lang, Guido; Babb, Jeffry

    2015-01-01

    The Computer Information Systems (CIS) discipline faces an identity crisis: although demand for CIS graduates is growing, student enrollment is either in decline, or is at least soft or flat in many cases. This has been referred to as the 21st century paradox. As one solution to this problem, we propose to integrate entrepreneurship in the CIS…

  5. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review

    Science.gov (United States)

    Buteau, Chantal; Marshall, Neil; Jarvis, Daniel; Lavicza, Zsolt

    2010-01-01

    We present results of a literature review pilot study (326 papers) regarding the use of Computer Algebra Systems (CAS) in tertiary mathematics education. Several themes that have emerged from the review are discussed: diverse uses of CAS, benefits to student learning, issues of integration and mathematics learning, common and innovative usage of…

  6. Integration of digital dental casts in cone-beam computed tomography scans

    NARCIS (Netherlands)

    Rangel, F.A.; Maal, T.J.J.; Berge, S.J.; Kuijpers-Jagtman, A.M.

    2012-01-01

    Cone-beam computed tomography (CBCT) is widely used in maxillofacial surgery. The CBCT image of the dental arches, however, is of insufficient quality to use in digital planning of orthognathic surgery. Several authors have described methods to integrate digital dental casts into CBCT scans, but all

  7. Variables that Affect Math Teacher Candidates' Intentions to Integrate Computer-Assisted Mathematics Education (CAME)

    Science.gov (United States)

    Erdogan, Ahmet

    2010-01-01

    Based on Social Cognitive Career Theory (SCCT) (Lent, Brown, & Hackett, 1994, 2002), this study tested the effects of mathematics teacher candidates' self-efficacy in, outcome expectations from, and interest in CAME on their intentions to integrate Computer-Assisted Mathematics Education (CAME). While mathematics teacher candidates' outcome…

  8. Measuring and Supporting Pre-Service Teachers' Self-Efficacy towards Computers, Teaching, and Technology Integration

    Science.gov (United States)

    Killi, Carita; Kauppinen, Merja; Coiro, Julie; Utriainen, Jukka

    2016-01-01

    This paper reports on two studies designed to examine pre-service teachers' self-efficacy beliefs. Study I investigated the measurement properties of a self-efficacy beliefs questionnaire comprising scales for computer self-efficacy, teacher self-efficacy, and self-efficacy towards technology integration. In Study I, 200 pre-service teachers…

  9. Pedagogical Factors Affecting Integration of Computers in Mathematics Instruction in Secondary Schools in Kenya

    Science.gov (United States)

    Wanjala, Martin M. S.; Aurah, Catherine M.; Symon, Koros C.

    2015-01-01

    The paper reports findings of a study which sought to examine the pedagogical factors that affect the integration of computers in mathematics instruction as perceived by teachers in secondary schools in Kenya. This study was based on the Technology Acceptance Model (TAM). A descriptive survey design was used for this study. Stratified and simple…

  10. Integration of Computers into the Medical School Curriculum: An Example from a Microbiology Course.

    Science.gov (United States)

    Platt, Mark W.; And Others

    1994-01-01

    While the use of computers has become widespread in recent years, a unified, integrated approach to their use in the medical school curriculum has not yet emerged. Describes a program at the University of New Mexico that will phase in computerization of its curriculum beginning in the fall of 1993. (LZ)

  11. Computer Technology-Integrated Projects Should Not Supplant Craft Projects in Science Education

    Science.gov (United States)

    Klopp, Tabatha J.; Rule, Audrey C.; Schneider, Jean Suchsland; Boody, Robert M.

    2014-01-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy…

  13. Design and implementation of an integrated computer working environment for doing mathematics and science

    NARCIS (Netherlands)

    Heck, A.; Kedzierska, E.; Ellermeijer, T.

    2009-01-01

    In this paper we report on the sustained research and development work at the AMSTEL Institute of the University of Amsterdam to improve mathematics and science education at primary and secondary school level, which has led, amongst other things, to the development of the integrated computer working environment.

  14. Computation of the radiation Q of dielectric-loaded electrically small antennas in integral equation formulations

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.

    2016-01-01

    A new technique for estimating the impedance frequency bandwidth of electrically small antennas loaded with magneto-dielectric material from a single-frequency simulation in a surface integral equation solver is presented. The estimate is based on the inverse of the radiation Q computed using newly...
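
    Estimates of this kind typically rest on the widely used relations between the tuned input impedance, the radiation Q, and the matched-VSWR bandwidth (Yaghjian and Best); the paper's own surface-integral-equation route to Q is not reproduced here. In the usual notation, with Z_0(\omega) = R_0(\omega) + jX_0(\omega) the tuned input impedance and s the maximum tolerable VSWR,

        Q(\omega_0) \approx \frac{\omega_0 \, \lvert Z_0'(\omega_0) \rvert}{2 R_0(\omega_0)},
        \qquad
        \mathrm{FBW}_V(\omega_0) \approx \frac{s - 1}{\sqrt{s} \, Q(\omega_0)},

    so a single-frequency computation of Q immediately yields a bandwidth estimate.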

  15. CIM (Computer Integrated Manufacturing) in Higher Education: A Partnership with IBM.

    Science.gov (United States)

    Andrews, Hans A.; Allen, John

    1992-01-01

    Discusses Illinois Valley Community College's selection as 1 of 48 community colleges operating technology transfer and demonstration centers for computer-integrated manufacturing (CIM) in the IBM-CIM in Higher Education Alliance. Reviews local industry involvement, curriculum development, faculty training, deliverables, and the prognosis for the…

  17. Existence and computation of equilibria of first-price auctions with integral valuations and bids

    DEFF Research Database (Denmark)

    Escamocher, Guillaume; Miltersen, Peter Bro; Santillan, Rocio

    2009-01-01

    We consider existence and computation of symmetric Pure Strategy Nash Equilibrium (PSNE) in single-item, sealed-bid, first-price auctions with integral valuations and bids. For the most general case, we show that existence of PSNE is NP-hard. Then, we present algorithmic results for the case...

  18. The Influence of Computer Games on Visual-Motor Integration in Profoundly Deaf Children

    Science.gov (United States)

    Radovanovic, Vesna

    2013-01-01

    The purpose of this research was to examine the influence of specialised software on the visual-motor integration of profoundly deaf children. The research sample was made up of 70 students aged from seven to 10, 43 of whom formed the experimental group and 27 the control group. The students in the experimental group used computers once a week…

  19. Problem-Solving Inquiry-Oriented Biology Tasks Integrating Practical Laboratory and Computer.

    Science.gov (United States)

    Friedler, Yael; And Others

    1992-01-01

    Presents results of a study that examines the development and use of computer simulations for high school science instruction and for integrated laboratory and computerized tests that are part of the biology matriculation examination in Israel. Eleven implications for teaching are presented. (MDH)

  1. Challenges in Integrating a Complex Systems Computer Simulation in Class: An Educational Design Research

    Science.gov (United States)

    Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.

    2012-01-01

    Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…

  2. Global optimization for integrated design and control of computationally expensive process models

    NARCIS (Netherlands)

    Egea, J.A.; Vries, D.; Alonso, A.A.; Banga, J.R.

    2007-01-01

    The problem of integrated design and control optimization of process plants is discussed in this paper. We consider it as a nonlinear programming problem subject to differential-algebraic constraints. This class of problems is frequently multimodal and "costly" (i.e., computationally expensive to ev

  5. A Methodology for Integrating Computer-Based Learning Tools in Science Curricula

    Science.gov (United States)

    Papadouris, Nicos; Constantinou, Constantinos P.

    2009-01-01

    This paper demonstrates a methodology for effectively integrating computer-based learning tools in science teaching and learning. This methodology provides a means of systematic analysis to identify the capabilities of particular software tools and to formulate a series of competencies relevant to physical science that could be developed by means…

  6. Health risks for the population living in the vicinity of an Integrated Waste Management Facility: Screening environmental pollutants

    Energy Technology Data Exchange (ETDEWEB)

    Domingo, José L., E-mail: joseluis.domingo@urv.cat [Laboratory of Toxicology and Environmental Health, School of Medicine, IISPV, Universitat Rovira i Virgili, Sant Llorenç 21, 43201 Reus, Catalonia (Spain); Rovira, Joaquim [Laboratory of Toxicology and Environmental Health, School of Medicine, IISPV, Universitat Rovira i Virgili, Sant Llorenç 21, 43201 Reus, Catalonia (Spain); Departament d'Enginyeria Química, Universitat Rovira i Virgili, Av. Països Catalans 26, 43007 Tarragona, Catalonia (Spain); Vilavert, Lolita; Nadal, Martí [Laboratory of Toxicology and Environmental Health, School of Medicine, IISPV, Universitat Rovira i Virgili, Sant Llorenç 21, 43201 Reus, Catalonia (Spain); Figueras, María J. [Microbiology Unit, School of Medicine, Universitat Rovira i Virgili, Sant Llorenç 21, 43201 Reus, Catalonia (Spain); Schuhmacher, Marta [Laboratory of Toxicology and Environmental Health, School of Medicine, IISPV, Universitat Rovira i Virgili, Sant Llorenç 21, 43201 Reus, Catalonia (Spain); Departament d'Enginyeria Química, Universitat Rovira i Virgili, Av. Països Catalans 26, 43007 Tarragona, Catalonia (Spain)

    2015-06-15

    We performed a screening investigation to assess the human health risks of the Integrated Waste Management Facility (IWMF: mechanical–biological treatment (MBT) plant plus municipal solid waste incinerator (MSWI); Ecoparc-3) of Barcelona (Spain). Air concentrations of pollutants potentially released by the MBT plant (VOCs and bioaerosols) and the MSWI (trace elements, PCDD/Fs and PCBs) were determined. Trace elements, PCDD/Fs and PCBs were also analyzed in soil samples. The concentrations of trace elements and bioaerosols were similar to those previously reported in other areas of similar characteristics, while formaldehyde was the predominant VOC. Interestingly, the PCDD/F concentrations in soil and air were the highest ever reported near a MSWI in Catalonia, with maximum concentrations of 10.8 ng WHO-TEQ/kg and 41.3 fg WHO-TEQ/m³, respectively. In addition, there has not been any reduction in soils, even after the closure of an adjacent power plant. The human health risks of PCDD/F exposure in the closest urban nucleus located downwind of the MSWI are up to 10 times higher than those near other MSWIs in Catalonia. Although the results must be considered very preliminary, they are a serious warning for local authorities. We strongly recommend conducting additional studies to confirm these findings and, if necessary, implementing measures to urgently mitigate the impact of the MSWI on the surrounding environment. We must also state the tremendous importance of an individual evaluation of MSWIs, rather than generalizing their environmental and health risks. - Highlights: • Health risks of an Integrated Waste Management Facility in Catalonia are assessed. • PCDD/F exposure near this facility is up to 10 times higher than that near others. • Environmental monitoring of incineration plants should be performed case-by-case. • Since results are very preliminary, confirmatory studies should be conducted.

  7. Investigation of TASS/SMR Capability to Predict a Natural Circulation in the Test Facility for an Integral Reactor

    Directory of Open Access Journals (Sweden)

    Young-Jong Chung

    2014-01-01

    The system-integrated modular advanced reactor (SMART) is a small-sized advanced integral-type pressurized water reactor (PWR) with a rated thermal power of 330 MW. It can produce 100 MW of electricity, or 90 MW of electricity and 40,000 tons of desalinated water concurrently, which is sufficient for 100,000 residents. The design features contributing to safety enhancement are basically inherent safety improvements and passive safety features. The TASS/SMR code was developed for the analysis of design-basis events and accidents in an integral-type reactor, reflecting the characteristics of the SMART design. The main purpose of the code is to analyze all relevant phenomena and processes. The code should be validated using experimental data in order to confirm its prediction capability. TASS/SMR predicts well the overall thermal-hydraulic behavior under various natural circulation conditions at the experimental test facility for an integral reactor. A pressure loss should be provided as a function of Reynolds number at low-velocity conditions in order to simulate the mass flow rate well under natural circulation.
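
    The Reynolds-number dependence called for in the last sentence can be illustrated with textbook friction-factor correlations; these are generic single-phase pipe correlations (laminar 64/Re, turbulent Blasius), not TASS/SMR's actual wall-friction models, and the blending and numbers are illustrative.

        import numpy as np

        def friction_factor(Re):
            """Darcy friction factor as a function of Reynolds number:
            f = 64/Re (laminar), Blasius f = 0.316*Re**-0.25 (turbulent),
            linearly blended across the transition region."""
            Re = np.asarray(Re, dtype=float)
            f_lam = 64.0 / Re
            f_turb = 0.316 * Re ** -0.25
            blend = np.clip((Re - 2300.0) / (4000.0 - 2300.0), 0.0, 1.0)
            return (1.0 - blend) * f_lam + blend * f_turb

        def pressure_loss(Re, L, D, rho, v):
            """Frictional pressure drop dp = f(Re) * (L/D) * rho * v**2 / 2 (Pa)."""
            return friction_factor(Re) * (L / D) * 0.5 * rho * v ** 2

        # At natural-circulation (low Re) conditions the laminar branch dominates;
        # a fixed turbulent coefficient would badly underpredict the loss there.
        for Re in (800.0, 2.0e3, 1.0e4):
            print(Re, friction_factor(Re))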

  8. Understanding principles of integration and segregation using whole-brain computational connectomics: implications for neuropsychiatric disorders.

    Science.gov (United States)

    Lord, Louis-David; Stevner, Angus B; Deco, Gustavo; Kringelbach, Morten L

    2017-06-28

    To survive in an ever-changing environment, the brain must seamlessly integrate a rich stream of incoming information into coherent internal representations that can then be used to efficiently plan for action. The brain must, however, balance its ability to integrate information from various sources with a complementary capacity to segregate information into modules which perform specialized computations in local circuits. Importantly, evidence suggests that imbalances in the brain's ability to bind together and/or segregate information over both space and time are a common feature of several neuropsychiatric disorders. Until recently, however, most studies have attempted to characterize the principles of integration and segregation only in static (i.e. time-invariant) representations of human brain networks, hence disregarding the complex spatio-temporal nature of these processes. In the present Review, we describe how the emerging discipline of whole-brain computational connectomics may be used to study the causal mechanisms of the integration and segregation of information on behaviourally relevant timescales. We emphasize how novel methods from network science and whole-brain computational modelling can expand beyond traditional neuroimaging paradigms and help to uncover the neurobiological determinants of the abnormal integration and segregation of information in neuropsychiatric disorders. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'. © 2017 The Author(s).
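
    The static integration and segregation measures that the Review moves beyond are easy to state concretely. The sketch below computes two common ones, global efficiency (integration) and modularity (segregation), on a toy two-module graph using NetworkX; the graph and its parameters are illustrative, not an actual connectome.

        import networkx as nx
        from networkx.algorithms.community import (greedy_modularity_communities,
                                                   modularity)

        # Toy "connectome": two densely connected modules with sparse bridges
        G = nx.planted_partition_graph(l=2, k=20, p_in=0.6, p_out=0.02, seed=1)

        integration = nx.global_efficiency(G)      # short paths across the network
        communities = greedy_modularity_communities(G)
        segregation = modularity(G, communities)   # strength of modular structure

        print(f"global efficiency (integration): {integration:.3f}")
        print(f"modularity (segregation):        {segregation:.3f}")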

  9. A Model of an Expert Computer Vision and Recognition Facility with Applications of a Proportion Technique.

    Science.gov (United States)

    2014-09-26

    Only fragments of the abstract survive in this record. They mention a function called WHATISFACE, note that "the model offering the most specific information about structure" was of interest, and cite [Rhodes], [Tucker], [Sowa], an Addison-Wesley "...Systems" volume (Massachusetts, 1983), and Hogg, D., "Model-based vision: a program to see a walking person", Image and Vision Computing, Vol. 1, No. 1, February 1983, pp. 5-20.

  10. Teacher Perceptions of the Integration of Laptop Computers in Their High School Biology Classrooms

    Science.gov (United States)

    Gundy, Morag S.

    2011-12-01

    Studies indicate that teachers, and in particular science teachers in the senior high school grades, do not integrate laptop computers into their instruction to the extent anticipated by researchers. This technology has not spread easily to other teachers even with improved access to hardware and software, increased support, and a paradigm shift from teacher-centred to student-centred education. Although a number of studies have focused on the issues and problems related to the integration of laptops in classroom instruction, these studies, largely quantitative in nature, have tended to bypass the role teachers play in integrating laptop computers into their instruction. This thesis documents and describes the role of Ontario high school science teachers in the integration of laptop computers in the classroom. Ten teachers who have successfully integrated laptop computers into their biology courses participated in this descriptive study. Their perceptions of implementing laptops into their biology courses, key factors about the implementation process, and how the implementation was accomplished are examined. The study also identifies the conditions which they feel would allow this innovation to be implemented by other teachers. Key findings of the study indicate that teachers must initiate, implement and sustain an emergent and still evolving innovation; teacher perceptions change and continue to change with increased experience using laptops in the science classroom; changes in teaching approaches are significant as a result of the introduction of laptop technology; and, the teachers considered the acquisition and use of new teaching materials to be an important aspect of integrating laptop computers into instruction. Ongoing challenges for appropriate professional development and for sharing knowledge, skills, and teaching materials are identified. The study provides a body of practical knowledge for biology teachers who are considering the integration of laptops into their instruction.

  11. Direction and Integration of Experimental Ground Test Capabilities and Computational Methods

    Science.gov (United States)

    Dunn, Steven C.

    2016-01-01

    This paper groups and summarizes the salient points and findings from two AIAA conference panels targeted at defining the direction, with associated key issues and recommendations, for the integration of experimental ground testing and computational methods. Each panel session utilized rapporteurs to capture comments from both the panel members and the audience. Additionally, a virtual panel of several experts was consulted between the two sessions and their comments were also captured. The information is organized into three time-based groupings, as well as by subject area. These panel sessions were designed to provide guidance to both researchers/developers and experimental/computational service providers in defining the future of ground testing, which will be inextricably integrated with the advancement of computational tools.

  12. Integrating computation and visualization for biomolecular analysis: an example using python and AVS.

    Science.gov (United States)

    Sanner, M F; Duncan, B S; Carrillo, C J; Olson, A J

    1999-01-01

    One of the challenges in biocomputing is to enable the efficient use of a wide variety of fast-evolving computational methods to simulate, analyze, and understand the complex properties and interactions of molecular systems. Our laboratory investigates several areas including molecular visualization, protein-ligand docking, protein-protein docking, molecular surfaces, and the derivation of phenomenological potentials. In this paper we present an approach based on the Python programming language to achieve a high level of integration between these different computational methods and our primary visualization system AVS. This approach removes many limitations of AVS while dramatically increasing the interoperability of our computational tools. Several examples are shown to illustrate how this approach enables a high level of integration and interoperability between different tools, while retaining modularity and avoiding the creation of a large monolithic package that is difficult to extend and maintain.
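
    The glue-language pattern described, independent computational components exposed to a visualization front-end through one scripting interface, can be sketched generically. All names below are hypothetical illustrations; the paper's actual Python/AVS modules are not shown in the abstract:

      # Hypothetical sketch of a scripting-level integration layer: separate
      # computational tools are registered behind one interface so a
      # visualization system (or any other client) can invoke them uniformly.
      from typing import Callable, Dict

      TOOL_REGISTRY: Dict[str, Callable] = {}

      def register_tool(name: str):
          """Decorator adding a computational component to the shared registry."""
          def wrap(func: Callable) -> Callable:
              TOOL_REGISTRY[name] = func
              return func
          return wrap

      @register_tool("molecular_surface")
      def molecular_surface(atom_coords, probe_radius=1.4):
          # Placeholder: a real tool would compute a solvent-excluded surface.
          return {"vertices": [], "triangles": [], "probe": probe_radius}

      @register_tool("docking_score")
      def docking_score(receptor, ligand):
          # Placeholder: a real tool would evaluate an interaction potential.
          return 0.0

      def run(name: str, *args, **kwargs):
          """Uniform entry point a visualization front-end could call."""
          return TOOL_REGISTRY[name](*args, **kwargs)

      surface = run("molecular_surface", atom_coords=[(0.0, 0.0, 0.0)])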

  13. An Architecture for Integrated Intelligence in Urban Management using Cloud Computing

    CERN Document Server

    Khan, Zaheer; McClatchey, Richard; Anjum, Ashiq

    2012-01-01

    With the emergence of new methodologies and technologies it has now become possible to manage large amounts of environmental sensing data and apply new integrated computing models to acquire information intelligence. This paper advocates the application of cloud capacity to support the information, communication and decision making needs of a wide variety of stakeholders in the complex business of the management of urban and regional development. The complexity lies in the interactions and impacts embodied in the concept of the urban-ecosystem at various governance levels. This highlights the need for more effective integrated environmental management systems. This paper offers a user-orientated approach based on requirements for an effective management of the urban-ecosystem and the potential contributions that can be supported by the cloud computing community. Furthermore, the commonality of the influence of the drivers of change at the urban level offers the opportunity for the cloud computing community to...

  14. Comparing the influence of spectro-temporal integration in computational speech segregation

    DEFF Research Database (Denmark)

    Bentsen, Thomas; May, Tobias; Kressner, Abigail Anne;

    2016-01-01

    The goal of computational speech segregation systems is to automatically segregate a target speaker from interfering maskers. Typically, these systems include a feature extraction stage in the front-end and a classification stage in the back-end. A spectro-temporal integration strategy can be applied in either the front-end, using the so-called delta features, or in the back-end, using a second classifier that exploits the posterior probability of speech from the first classifier across a spectro-temporal window. This study systematically analyzes the influence of such stages on segregation … metric that comprehensively predicts computational segregation performance and correlates well with intelligibility. The outcome of this study could help to identify the most effective spectro-temporal integration strategy for computational segregation systems. …
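
    The delta features mentioned for front-end integration are temporal regression coefficients of the base features over a short window (the standard HTK-style formula). A minimal sketch, assuming a frames-by-channels feature matrix; the window width and inputs are illustrative:

      import numpy as np

      def delta_features(feats: np.ndarray, n: int = 2) -> np.ndarray:
          """HTK-style delta (temporal regression) features.

          feats: array of shape (num_frames, num_channels), e.g. log-mel energies.
          n:     half-width of the regression window.
          Returns an array of the same shape holding the per-frame deltas.
          """
          num_frames = feats.shape[0]
          denom = 2.0 * sum(i * i for i in range(1, n + 1))
          # Pad by repeating edge frames so every frame has a full window.
          padded = np.pad(feats, ((n, n), (0, 0)), mode="edge")
          deltas = np.zeros_like(feats, dtype=float)
          for t in range(num_frames):
              acc = np.zeros(feats.shape[1])
              for i in range(1, n + 1):
                  acc += i * (padded[t + n + i] - padded[t + n - i])
              deltas[t] = acc / denom
          return deltas

      # Example: deltas of a random 100-frame, 32-channel feature matrix.
      X = np.random.rand(100, 32)
      dX = delta_features(X)      # same shape as X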

  15. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    Energy Technology Data Exchange (ETDEWEB)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  16. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Operation support tools are under consideration for application to research reactors. In particular, with the full digitalization of the main control room, a computer-based procedure (CBP) system has been required as part of the man-machine interface system, because it has an impact on operational staffing and human errors in a research reactor. A CBP can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. A well-designed CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operators' routine tasks. However, a CBP for a research reactor has not yet been proposed. CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, but many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing one is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. To establish the computer-based system requirements, this paper addresses international standards and previous practices at nuclear plants, and introduces the high-level requirements derived from the system requirements analysis as the first stage of system implementation.

  17. Facile identification of dual FLT3-Aurora A inhibitors: a computer-guided drug design approach.

    Science.gov (United States)

    Chang Hsu, Yung; Ke, Yi-Yu; Shiao, Hui-Yi; Lee, Chieh-Chien; Lin, Wen-Hsing; Chen, Chun-Hwa; Yen, Kuei-Jung; Hsu, John T-A; Chang, Chungming; Hsieh, Hsing-Pang

    2014-05-01

    Computer-guided drug design is a powerful tool for drug discovery. Herein we disclose the use of this approach for the discovery of dual FMS-like receptor tyrosine kinase-3 (FLT3)-Aurora A inhibitors against cancer. An Aurora hit compound was selected as a starting point, from which 288 virtual molecules were screened. Subsequently, some of these were synthesized and evaluated for their capacity to inhibit FLT3 and Aurora kinase A. To further enhance FLT3 inhibition, structure-activity relationship studies of the lead compound were conducted through a simplification strategy and bioisosteric replacement, followed by the use of computer-guided drug design to prioritize molecules bearing a variety of different terminal groups in terms of favorable binding energy. Selected compounds were then synthesized, and their bioactivity was evaluated. Of these, one novel inhibitor was found to exhibit excellent inhibition of FLT3 and Aurora kinase A and exert a dramatic antiproliferative effect on MOLM-13 and MV4-11 cells, with an IC50 value of 7 nM. Accordingly, it is considered a highly promising candidate for further development.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs run by users each day, which was already meeting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per RAW event. The central collisions are more complex and...

  19. Opportunities of CMOS-MEMS integration through LSI foundry and open facility

    Science.gov (United States)

    Mita, Yoshio; Lebrasseur, Eric; Okamoto, Yuki; Marty, Frédéric; Setoguchi, Ryota; Yamada, Kentaro; Mori, Isao; Morishita, Satoshi; Imai, Yoshiaki; Hosaka, Kota; Hirakawa, Atsushi; Inoue, Shu; Kubota, Masanori; Denoual, Matthieu

    2017-06-01

    Since the 2000s, several countries have established micro- and nanofabrication platforms for the research and education community as national projects. By combining such platforms with VLSI multichip foundry services, various integrated devices, referred to as “CMOS-MEMS”, can be realized without constructing an entire cleanroom. In this paper, we summarize MEMS-last postprocess schemes for CMOS devices on a bulk silicon wafer as well as on a silicon-on-insulator (SOI) wafer using an open-access cleanroom of the Nanotechnology Platform of MEXT Japan. The integration devices presented in this article are free-standing structures and postprocess isolated LSI devices. Postprocess issues are identified with their solutions, such as the reactive ion etching (RIE) lag for dry release and the impact of the deep RIE (DRIE) postprocess on transistor characteristics. Integration with nonsilicon materials is proposed as one of the future directions.

  20. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    Energy Technology Data Exchange (ETDEWEB)

    Snyder, Abigail C. [University of Pittsburgh; Jiao, Yu [ORNL

    2010-10-01

    Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10⁶–10¹² data points). Hence, traditional data analysis tools running on a single CPU take too long to be practical, and scientists are unable to efficiently analyze all data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to evaluate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method can give scientists the opportunity to analyze all experimental data more effectively.
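
    Both strategies named here, stacked one-dimensional solvers and quasi-Monte Carlo, can be compared on a toy problem. A minimal sketch with SciPy rather than the GNU Scientific Library; the integrand is an arbitrary smooth test function, not the SNS intensity model:

      import numpy as np
      from scipy import integrate
      from scipy.stats import qmc

      # Arbitrary smooth 4-D test integrand on the unit hypercube; the real
      # SNS intensity integrand is not given in the report.
      def f(x1, x2, x3, x4):
          return np.exp(-(x1**2 + x2**2 + x3**2 + x4**2))

      # Approach 1: nested adaptive 1-D quadrature (stacking 1-D solvers).
      val_quad, err = integrate.nquad(f, [[0, 1]] * 4)

      # Approach 2: quasi-Monte Carlo with a scrambled Sobol sequence.
      sampler = qmc.Sobol(d=4, scramble=True, seed=0)
      pts = sampler.random_base2(m=14)          # 2**14 low-discrepancy points
      val_qmc = np.mean(f(*pts.T))              # unit-cube volume is 1

      print(f"nested quadrature: {val_quad:.6f} (est. error {err:.1e})")
      print(f"Sobol QMC:         {val_qmc:.6f}")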