WorldWideScience

Sample records for computing resource center

  1. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). An OSG stack is installed for the NOvA experiment. Other user groups use the local batch system directly. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are located in Plzeň and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources with the standard ATLAS tools in the same way as the local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated mostly to users from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on TORQUE with a custom scheduler. The clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. IoP users have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any MetaCentrum user. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a total capacity of more than 12000 cores.
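
    The MetaCentrum clusters described above are driven by a TORQUE batch system, so user work enters through a qsub submission. Below is a minimal sketch in Python (the language used for all examples added to this listing); the queue name, resource counts, and program are hypothetical, not the actual LUNA/EXMAG configuration.

      import subprocess
      import textwrap

      # Hypothetical TORQUE job script; queue and resources are placeholders.
      job_script = textwrap.dedent("""\
          #!/bin/bash
          #PBS -N solidstate_md
          #PBS -q luna
          #PBS -l nodes=2:ppn=16
          #PBS -l walltime=24:00:00
          cd "$PBS_O_WORKDIR"
          mpirun ./md_simulation input.dat
          """)

      with open("job.pbs", "w") as f:
          f.write(job_script)

      # qsub prints the assigned job identifier (e.g. "12345.server") on success.
      result = subprocess.run(["qsub", "job.pbs"],
                              capture_output=True, text=True, check=True)
      print("submitted:", result.stdout.strip())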

  2. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  3. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC), and in April 2003 the LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time the LCRC has had a broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for the LCRC comes from the Computational Science Advisory Committee (CSAC), composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  4. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are jointly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the Worldwide LHC Computing Grid (WLCG). The status and performance of the Tier-2 center are presented, with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenges of operating the cluster jointly are detailed. The benefit is an efficient use of computing and personnel resources.

  5. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences, and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from desktop clusters and institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests proved the capability to serve the scientific use case in the European 1&1 data centers. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
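
    The OpenStack-based provisioning described above boils down to booting worker-node virtual machines through the cloud API on demand. A minimal sketch with the openstacksdk library follows; the cloud entry, image, flavor, and network names are placeholders, not the actual IEKP or NEMO setup.

      import openstack  # openstacksdk

      # Connect using an entry from clouds.yaml; "institute-cloud" is a placeholder.
      conn = openstack.connect(cloud="institute-cloud")

      # Look up boot parameters by name; all names here are placeholders.
      image = conn.compute.find_image("worker-node-image")
      flavor = conn.compute.find_flavor("m1.xlarge")
      network = conn.network.find_network("batch-net")

      # Boot one virtualized worker node and wait until it is ACTIVE.
      server = conn.compute.create_server(
          name="wn-001",
          image_id=image.id,
          flavor_id=flavor.id,
          networks=[{"uuid": network.id}],
      )
      server = conn.compute.wait_for_server(server)
      print(server.name, server.status)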

  6. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has enabled researchers to meet project milestones and achieve breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  7. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  8. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  9. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [ORNL]

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries.

  10. Oklahoma Water Resources Center

    Science.gov (United States)

  11. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries. An important activity of RSICC is its participation in international efforts on computational and experimental benchmarks. An example is the Shielding Integral Benchmarks Archival Database (SINBAD), which includes shielding benchmarks for fission, fusion and accelerators. RSICC is funded by the United States Department of Energy, Department of Homeland Security and Nuclear Regulatory Commission.

  12. Water Resources Research Center

    Science.gov (United States)

    Welcome to the University of Hawai'i at Manoa Water Resources Research Center (WRRC). At WRRC we concentrate on addressing the unique water and wastewater management problems of Hawai'i by researching water-related issues distinctive to these areas, while staying informed about similar issues elsewhere. We are Hawaii's link in a network of water resources research centers.

  13. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  14. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  15. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    OpenAIRE

    Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational cos...
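
    One core ingredient of such Green Cloud solutions is energy-aware VM consolidation. The sketch below is purely illustrative (not the authors' architecture): a first-fit-decreasing heuristic packs VM CPU demands onto as few powered-on hosts as possible so the remainder can be switched off.

      # First-fit-decreasing consolidation: pack VM CPU demands onto as few
      # powered-on hosts as possible so the rest can be switched off.
      def consolidate(vm_demands, host_capacity):
          hosts = []  # summed load of each powered-on host
          for demand in sorted(vm_demands, reverse=True):
              for i, load in enumerate(hosts):
                  if load + demand <= host_capacity:
                      hosts[i] += demand
                      break
              else:
                  hosts.append(demand)  # power on another host
          return hosts

      print(consolidate([0.6, 0.3, 0.5, 0.2, 0.4], host_capacity=1.0))
      # -> [1.0, 1.0]: two powered-on hosts instead of five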

  16. ENERGY RESOURCES CENTER

    Energy Technology Data Exchange (ETDEWEB)

    Sternberg, Virginia

    1979-11-01

    First I will give a short history of this Center which has had three names and three moves (and one more in the offing) in three years. Then I will tell you about the accomplishments made in the past year. And last, I will discuss what has been learned and what is planned for the future. The Energy and Environment Information Center (EEIC), as it was first known, was organized in August 1975 in San Francisco as a cooperative venture by the Federal Energy Administration (FEA), Energy Research and Development Administration (ERDA) and the Environmental Protection Agency (EPA). These three agencies planned this effort to assist the public in obtaining information about energy and the environmental aspects of energy. The Public Affairs Offices of FEA, ERDA and EPA initiated the idea of the Center. One member from each agency worked at the Center, with assistance from the Lawrence Berkeley Laboratory Information Research Group (LBL IRG) and with on-site help from the EPA Library. The Center was set up in a corner of the EPA Library. FEA and ERDA each contributed one staff member on a rotating basis to cover the daily operation of the Center and money for books and periodicals. EPA contributed space, staff time for ordering, processing and indexing publications, and additional money for acquisitions. The LBL Information Research Group received funds from ERDA on a 189 FY 1976 research project to assist in the development of the Center as a model for future energy centers.

  17. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  18. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational science is an integral component of Brookhaven's multi-science mission and a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host to the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  19. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational science is an integral component of Brookhaven's multi-science mission and a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host to the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  20. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab]

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources, we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment onto these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
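
    Since all CMS batch processing goes through HTCondor/glideinWMS, an opportunistic slot ultimately runs an ordinary HTCondor job. The sketch below uses the htcondor Python bindings to describe and queue such a job; the wrapper script and resource requests are hypothetical, and the submit call shown is the one in recent binding versions.

      import htcondor

      # Job description; executable and resource requests are placeholders.
      sub = htcondor.Submit({
          "executable": "run_cmssw.sh",   # hypothetical wrapper script
          "arguments": "step3_cfg.py",
          "output": "job.out",
          "error": "job.err",
          "log": "job.log",
          "request_cpus": "1",
          "request_memory": "2000MB",
      })

      schedd = htcondor.Schedd()          # local scheduler daemon
      result = schedd.submit(sub)         # place the job in the queue
      print("cluster id:", result.cluster())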

  1. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is no exception. In 2017, ATLAS steadily used almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed over geographically separated locations and connected by Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files, and network transfers. The ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used by other collaborations as well. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers, and primarily supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources and extensive software re...
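
    Rucio hides the geographical distribution of replicas behind a uniform client API. A minimal hedged sketch follows; the scope and dataset name are placeholders, and a configured Rucio environment (account, authentication) is assumed.

      from rucio.client import Client

      # Scope and dataset name below are placeholders.
      client = Client()
      dids = [{"scope": "mc16_13TeV", "name": "mc16_13TeV.example.DAOD"}]
      for replica in client.list_replicas(dids):
          # Each entry names a file and the storage elements holding copies.
          print(replica["name"], sorted(replica["rses"]))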

  2. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  3. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  4. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  5. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  6. National Sexual Violence Resource Center (NSVRC)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The National Sexual Violence Resource Center (NSVRC) is a national information and resource hub relating to all aspects of sexual violence. NSVRC staff collect and...

  7. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to the application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence in the modeling, analysis, sensitivity studies, optimization, design, and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state of the art in applications of advanced computational technology to the analysis, design, prototyping, and operation of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; and writing state-of-the-art monographs and NASA special publications on timely topics.

  8. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)

  9. A Dynamic and Interactive Monitoring System of Data Center Resources

    Directory of Open Access Journals (Sweden)

    Yu Ling-Fei

    2016-01-01

    To maximize the utilization and effectiveness of resources, it is very necessary to have a well-suited management system for modern data centers. Traditional approaches to resource provisioning and service requests have proven to be ill-suited for virtualization and cloud computing. The manual handoffs between technology teams were also highly inefficient and poorly documented. In this paper, a dynamic and interactive monitoring system for data center resources, ResourceView, is presented. By consolidating all data center management functionality into a single interface, ResourceView shares a common view of the timeline metric status while providing comprehensive, centralized monitoring of data center physical and virtual IT assets, including power, cooling, physical space, and VMs, so as to improve availability and efficiency. In addition, servers and VMs can be monitored from several viewpoints, such as clusters, racks, and projects, which is very convenient for users.
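
    The idea of consolidating one stream of measurements into several viewpoints (clusters, racks, projects) can be illustrated in a few lines. This is an illustrative sketch only, not ResourceView's implementation; the sample metrics are invented.

      from collections import defaultdict

      samples = [
          # server, cluster, rack, project, power_W
          ("s01", "hadoop", "r1", "video",  310.0),
          ("s02", "hadoop", "r1", "search", 295.0),
          ("s03", "web",    "r2", "video",  180.0),
      ]

      def aggregate(rows, key_index, value_index=4):
          """Sum one metric under a chosen viewpoint (cluster/rack/project)."""
          totals = defaultdict(float)
          for row in rows:
              totals[row[key_index]] += row[value_index]
          return dict(totals)

      print(aggregate(samples, 1))  # per cluster: {'hadoop': 605.0, 'web': 180.0}
      print(aggregate(samples, 3))  # per project: {'video': 490.0, 'search': 295.0}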

  10. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  11. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  12. A Reading Resource Center: Why and How

    Science.gov (United States)

    Minkoff, Henry

    1974-01-01

    Hunter College has set up a Reading Resource Center where students receive individualized help in specific problem areas not covered in their reading classes and where teachers can find materials either for their own edification or for use in the classroom. (Author)

  13. Self-Access Centers: Maximizing Learners’ Access to Center Resources

    Directory of Open Access Journals (Sweden)

    Mark W. Tanner

    2010-09-01

    Originally published in TESL-EJ March 2009, Volume 12, Number 4 (http://tesl-ej.org/ej48/a2.html). Reprinted with permission from the authors. Although some students have discovered how to use self-access centers effectively, the majority appear to be unaware of available resources. A website and database of materials were created to help students locate materials and use the Self-Access Study Center (SASC) at Brigham Young University's English Language Center (ELC) more effectively. Students took two surveys regarding their use of the SASC. The first survey was given before the website and database were made available. A second survey was administered 12 weeks after students had been introduced to the resource. An analysis of the data shows that students tend to use SASC resources more autonomously as a result of having a web-based database. The survey results suggest that SAC managers can encourage more autonomous use of center materials by providing a website and database to help students find appropriate materials to use to learn English.

  14. Illinois trauma centers and community violence resources

    Directory of Open Access Journals (Sweden)

    Bennet Butler

    2014-01-01

    Background: Elder abuse and neglect (EAN), intimate partner violence (IPV), and street-based community violence (SBCV) are significant public health problems, which frequently lead to traumatic injury. Trauma centers can provide an effective setting for intervention and referral, potentially interrupting the cycle of violence. Aims: To assess existing institutional resources for the identification and treatment of violence victims among patients presenting with acute injury to statewide trauma centers. Settings and Design: We used a prospective, web-based survey of trauma medical directors at 62 Illinois trauma centers. Nonresponders were contacted via telephone to complete the survey. Materials and Methods: This survey was based on a survey conducted in 2004 assessing trauma centers and IPV resources. We modified this survey to collect data on IPV, EAN, and SBCV. Statistical Analysis: Univariate and bivariate statistics were performed using STATA statistical software. Results: We found that 100% of trauma centers now screen for IPV, an improvement from 2004 (P = 0.007). Screening for EAN (70%) and SBCV (61%) was less common (P < 0.001), and hospitals thought that resources for SBCV in particular were inadequate (P < 0.001) and fewer resources were available for these patients (P = 0.02). However, there was a lack of uniformity in screening, tracking, and referral practices for victims of violence throughout the state. Conclusion: The multiplicity of strategies for tracking and referring victims of violence in Illinois makes it difficult to assess screening and tracking or to form generalized policy recommendations. This presents an opportunity to improve care delivered to victims of violence by standardizing care and referral protocols.

  15. A Computer Learning Center for Environmental Sciences

    Science.gov (United States)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draws upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  16. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing, and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications for how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, as it involves vastly different architectures, and the process is beyond the capabilities of human users. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at the infrastructure and platform levels.

  17. Statistics Online Computational Resource for Education

    Science.gov (United States)

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  18. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.]

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  19. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  20. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  1. Efficient Resource Management in Cloud Computing

    OpenAIRE

    Rushikesh Shingade; Amit Patil; Shivam Suryawanshi; M. Venkatesan

    2015-01-01

    Cloud computing is one of the most widely used technologies to provide cloud services to users, who are charged for the services they receive. Given the large number of resources involved, evaluating the performance of cloud resource management policies and optimizing them efficiently is difficult. Different simulation toolkits are available for simulating and modelling the cloud computing environment, such as GridSim, CloudAnalyst, CloudSim, GreenCloud, CloudAuction, etc. The proposed Efficient Resource Manage...

  2. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements for the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The implemented solution was tested thoroughly within the computing environment of the KEDR detector experiment being carried out at BINP, and it is foreseen to be applied to the use cases of other HEP experiments in the near future.

  3. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  4. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  5. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute of Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, aiming at the further development of a wide field of accelerator science based on high energy accelerators. Within this Research Organization, the Applied Research Laboratory comprises four Centers that support research activities common to the whole Organization, together with their related research and development (R and D), integrating the four existing centers and their related sections in Tanashi. This support covers not only general assistance but also the preparation and R and D of the systems required to promote the research and its future plans. Computer technology is essential to the development of this research and can be shared among the various research activities of the Organization. In response to these expectations, the new Computing Research Center is to carry out its duties in cooperation with researchers across a range from R and D on data analysis of various experiments to computational physics exploiting powerful computing capacity such as supercomputers. This report describes the work and present status of the KEK Data Processing Center in the first chapter and of the INS computer room in the second chapter, as well as future tasks for the Computing Research Center. (G.K.)

  6. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...

  7. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains to provide wireless communications services on demand. Each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.
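
    As a toy illustration of such allocation algorithms (not the authors' tools), the sketch below admits each session request into the first cluster with enough spare cores, the kind of baseline against which more sophisticated strategies are compared.

      # First-fit admission of SDR session requests to clusters.
      def allocate(clusters, request_cores):
          """clusters: list of [used, capacity]; return cluster index or None."""
          for i, (used, capacity) in enumerate(clusters):
              if capacity - used >= request_cores:
                  clusters[i][0] += request_cores  # commit the allocation
                  return i
          return None  # reject: no cluster can host this transceiver

      clusters = [[0, 64], [0, 64], [0, 64]]
      for cores in [16, 48, 8, 32, 40]:
          print(cores, "->", allocate(clusters, cores))
      print("occupation:", clusters)  # [[64, 64], [40, 64], [40, 64]]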

  8. Virtualized cloud data center networks issues in resource management

    CERN Document Server

    Tsai, Linjiun

    2016-01-01

    This book discusses the characteristics of virtualized cloud networking, identifies the requirements of cloud network management, and illustrates the challenges in deploying virtual clusters in multi-tenant cloud data centers. The book also introduces network partitioning techniques to provide contention-free allocation, topology-invariant reallocation, and highly efficient resource utilization, based on the Fat-tree network structure. Managing cloud data center resources without considering resource contentions among different cloud services and dynamic resource demands adversely affects the performance of cloud services and reduces the resource utilization of cloud data centers. These challenges are mainly due to strict cluster topology requirements, resource contentions between uncooperative cloud services, and spatial/temporal data center resource fragmentation. Cloud data center network resource allocation/reallocation which cope well with such challenges will allow cloud services to be provisioned with ...
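
    For orientation, the standard k-ary Fat-tree arithmetic that such partitioning techniques build on can be checked quickly; the sketch below computes pod, switch, and host counts for a given switch port count k.

      # Sizes of a k-ary Fat-tree (k = switch port count, k even):
      # k pods, (k/2)^2 core switches, k^3/4 hosts.
      def fat_tree(k):
          assert k % 2 == 0, "port count must be even"
          return {
              "pods": k,
              "core_switches": (k // 2) ** 2,
              "aggregation_switches": k * (k // 2),
              "edge_switches": k * (k // 2),
              "hosts": k ** 3 // 4,
          }

      print(fat_tree(4))   # 4 pods, 4 core switches, 16 hosts
      print(fat_tree(48))  # 48 pods, 576 core switches, 27648 hosts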

  9. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  10. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network.

  11. Animal Resource Program | Center for Cancer Research

    Science.gov (United States)

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Manager:

  12. Animal Resource Program | Center for Cancer Research

    Science.gov (United States)

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Office:

  13. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

    Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important to education, security monitoring, and so on. However, their huge volume, complex data types, inefficient processing performance, weak security, and long loading times pose challenges in video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework which can provide cloud-based platforms, and it presents an opportunity for solving these problems. This paper presents a video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for moving video resource management into a cloud computing environment. The design was tested by establishing a simulation system prototype.
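
    As an illustration of the storage layer only, here is a minimal sketch of archiving a video file in HDFS. The paper does not publish its code; the WebHDFS endpoint, paths, and the use of the third-party Python "hdfs" client are our assumptions.

        # Minimal sketch: archive a video file in HDFS over WebHDFS.
        # The namenode URL, user name, and directory layout are hypothetical.
        from hdfs import InsecureClient  # third-party package "hdfs"

        client = InsecureClient('http://namenode.example.org:9870', user='video')

        def archive_video(local_path: str, hdfs_dir: str = '/videos/raw') -> str:
            """Upload one video file and return its HDFS path."""
            hdfs_path = f"{hdfs_dir}/{local_path.rsplit('/', 1)[-1]}"
            client.upload(hdfs_path, local_path, overwrite=True)
            return hdfs_path

        if __name__ == '__main__':
            print(archive_video('lecture01.mp4'))
            print(client.list('/videos/raw'))  # verify the upload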

  14. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with increasing user demands. For scalability and better service provisioning, clouds at times have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. Research on Intercloud computing is still in its infancy, and existing studies treat resource management, one of the key concerns to be addressed in Intercloud computing, only in a trivial and simplistic way. In this study, we present a resource management model that takes into account different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.
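
    A toy illustration of the pricing-and-refunding idea follows; the rates, discount levels, and refund rule are hypothetical and not the authors' actual model.

        # Hypothetical pricing/refunding by service and customer type.
        from dataclasses import dataclass

        RATES = {'storage': 0.05, 'compute': 0.12}    # price per unit-hour
        DISCOUNT = {'regular': 0.0, 'premium': 0.15}  # by customer type

        @dataclass
        class Request:
            service: str      # 'storage' or 'compute'
            units: float
            hours: float
            customer: str     # 'regular' or 'premium'
            delivered: float  # fraction of the SLA actually delivered, 0..1

        def charge(req: Request) -> float:
            base = RATES[req.service] * req.units * req.hours
            price = base * (1.0 - DISCOUNT[req.customer])
            refund = price * max(0.0, 1.0 - req.delivered)  # refund undelivered share
            return price - refund

        print(charge(Request('compute', units=8, hours=24, customer='premium', delivered=0.95)))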

  15. 76 FR 53885 - Patent and Trademark Resource Centers Metrics

    Science.gov (United States)

    2011-08-30

    ... DEPARTMENT OF COMMERCE United States Patent and Trademark Office Patent and Trademark Resource Centers Metrics ACTION: Proposed collection; comment request. SUMMARY: The United States Patent and... ``Patent and Trademark Resource Centers Metrics comment'' in the subject line of the message. Mail: Susan K...

  16. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments, enormous amounts of data are analyzed and simulated. Traditionally, dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers that provide regular cloud services to users, as such centers can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost-efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (the bwForCluster located in Freiburg).
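
    The core reconciliation loop of such an on-demand manager can be sketched as below; the scheduler/cloud API and the thresholds are hypothetical, not ROCED's actual interfaces.

        # Demand-driven scaling sketch: keep the number of cloud worker
        # nodes proportional to the queued batch jobs.
        import time
        from math import ceil

        def scale(cloud, batch, max_vms=100, jobs_per_vm=8):
            """One reconciliation step: match booted VMs to queued demand."""
            target = min(ceil(batch.queued_jobs() / jobs_per_vm), max_vms)
            running = cloud.count_vms()
            if running < target:
                cloud.boot_vms(target - running)             # join the cluster
            elif running > target:
                cloud.drain_and_terminate(running - target)  # idle nodes only

        def run(cloud, batch, period=60):
            while True:
                scale(cloud, batch)
                time.sleep(period)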

  17. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  18. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  19. LHCb Computing Resource usage in 2017

    CERN Document Server

    Bozzi, Concezio

    2018-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2017. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  20. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package for calculating quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information, and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.
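
    For concreteness, two standard coherence measures can be computed in a few lines of numpy; these are textbook definitions, not necessarily the package's own function names.

        # l1-norm of coherence and relative entropy of coherence.
        import numpy as np

        def l1_coherence(rho):
            """Sum of the absolute values of the off-diagonal elements."""
            return float(np.abs(rho).sum() - np.abs(np.diag(rho)).sum())

        def entropy(rho):
            ev = np.linalg.eigvalsh(rho)
            ev = ev[ev > 1e-12]
            return float(-(ev * np.log2(ev)).sum())

        def rel_entropy_coherence(rho):
            """S(diag(rho)) - S(rho), with diag(rho) the dephased state."""
            return entropy(np.diag(np.diag(rho))) - entropy(rho)

        plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, maximally coherent
        print(l1_coherence(plus), rel_entropy_coherence(plus))  # 1.0 1.0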

  1. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  2. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
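
    The task-farm pattern Lobster builds on looks roughly as follows in the cctools Work Queue Python bindings; the script and file names are invented, and method names may vary between cctools versions.

        # Submit independent tasks to any worker that connects.
        import work_queue as wq

        q = wq.WorkQueue(port=9123)      # workers attach to this port

        for i in range(100):
            t = wq.Task(f'./analyze.sh input_{i}.root out_{i}.root')
            t.specify_input_file(f'input_{i}.root')
            t.specify_output_file(f'out_{i}.root')
            q.submit(t)

        while not q.empty():
            t = q.wait(5)                # returns a finished task or None
            if t is not None:
                print(f'task {t.id} exited with status {t.return_status}')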

  3. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States); Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States); Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States); Yu, H, E-mail: tpeterka@mcs.anl.go [Sandia National Laboratories, California, Livermore, CA 94551 (United States)

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  4. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  5. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
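
    The inference idea can be illustrated with a simple sliding-window rule; the window size and thresholds here are hypothetical and do not reproduce SAAB's actual algorithm.

        # Blacklist a storage area on a sustained failure pattern in the
        # history of monitoring tests; recover when the pattern clears.
        from collections import deque

        class StorageAreaStatus:
            def __init__(self, window=12, blacklist_at=0.75, recover_at=0.25):
                self.history = deque(maxlen=window)  # True = test failed
                self.blacklist_at = blacklist_at
                self.recover_at = recover_at
                self.blacklisted = False

            def record(self, failed):
                self.history.append(failed)
                rate = sum(self.history) / len(self.history)
                if not self.blacklisted and rate >= self.blacklist_at:
                    self.blacklisted = True      # automatic outage handling
                elif self.blacklisted and rate <= self.recover_at:
                    self.blacklisted = False     # automatic recovery
                return self.blacklisted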

  6. Patient centered integrated clinical resource management.

    Science.gov (United States)

    Hofdijk, Jacob

    2011-01-01

    The impact of funding systems on providers' IT systems has been enormous and has prevented the implementation of designs focused on the health issues of patients. The paradigm shift the Dutch Ministry of Health has taken in funding health care has had a remarkable impact on the orientation of IT systems design. Since 2007 the next step has been taken: the application of the funding concept to chronic diseases, using clinical standards as the norm. The focus on prevention involves the patient as an active partner in the care plan. The new dimension in funding has initiated a process directed at the development of systems to support collaborative working and the active involvement of patients and their informal carers. This national approach will be presented to assess its international potential, as all countries face a long-term care crisis and lack the resources to meet the health needs of their populations.

  7. Survivable resource orchestration for optically interconnected data center networks.

    Science.gov (United States)

    Zhang, Qiong; She, Qingya; Zhu, Yi; Wang, Xi; Palacharla, Paparao; Sekiya, Motoyoshi

    2014-01-13

    We propose resource orchestration schemes in overlay networks enabled by optical network virtualization. Based on the information from underlying optical networks, our proposed schemes provision the fewest data centers to guarantee K-connect survivability, thus maintaining resource availability for cloud applications under any failure.
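
    A brute-force toy version of the provisioning goal is sketched below, assuming K-connect survivability means that after any single node failure the client can still reach at least K provisioned data centers; the graph model and the use of networkx are our illustration, not the authors' scheme.

        # Pick the fewest data centers that survive any single node failure.
        import itertools
        import networkx as nx

        def k_connect_survivable(g, client, sites, k):
            """True if every single-node failure leaves >= k sites reachable."""
            for f in g.nodes:
                if f == client:
                    continue
                h = g.copy()
                h.remove_node(f)
                reachable = nx.node_connected_component(h, client)
                if sum(s in reachable for s in sites if s != f) < k:
                    return False
            return True

        def fewest_data_centers(g, client, candidates, k):
            """Smallest candidate subset guaranteeing K-connect survivability."""
            for r in range(k, len(candidates) + 1):
                for subset in itertools.combinations(candidates, r):
                    if k_connect_survivable(g, client, subset, k):
                        return subset
            return None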

  8. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  9. Efficient management of data center resources for massively multiplayer online games

    NARCIS (Netherlands)

    Nae, V.; Iosup, A.; Podlipnig, S.; Prodan, R.; Epema, D.H.J.; Fahringer, T.

    2008-01-01

    Today's massively multiplayer online games (MMOGs) can include millions of concurrent players spread across the world. To keep these highly-interactive virtual environments online, a MMOG operator may need to provision tens of thousands of computing resources from various data centers. Faced with

  10. Designing and Implementing a Parenting Resource Center for Pregnant Teens

    Science.gov (United States)

    Broussard, Anne B; Broussard, Brenda S

    2009-01-01

    The Resource Center for Young Parents-To-Be is a longstanding and successful grant-funded project that was initiated as a response to an identified community need. Senior-level baccalaureate nursing students and their maternity-nursing instructors are responsible for staffing the resource center's weekly sessions, which take place at a public school site for pregnant adolescents. Childbirth educators interested in working with this population could assist in replicating this exemplary clinical project in order to provide prenatal education to this vulnerable and hard-to-reach group. PMID:20190852

  11. Nursing Reference Center: a point-of-care resource.

    Science.gov (United States)

    Vardell, Emily; Paulaitis, Gediminas Geddy

    2012-01-01

    Nursing Reference Center is a point-of-care resource designed for the practicing nurse, as well as nursing administrators, nursing faculty, and librarians. Users can search across multiple resources, including topical Quick Lessons, evidence-based care sheets, patient education materials, practice guidelines, and more. Additional features include continuing education modules, e-books, and a new iPhone application. A sample search and comparison with similar databases were conducted.

  12. Biosecurity and Health Monitoring at the Zebrafish International Resource Center

    OpenAIRE

    Murray, Katrina N.; Varga, Zoltán M.; Kent, Michael L.

    2016-01-01

    The Zebrafish International Resource Center (ZIRC) is a repository and distribution center for mutant, transgenic, and wild-type zebrafish. In recent years annual imports of new zebrafish lines to ZIRC have increased tremendously. In addition, after 15 years of research, we have identified some of the most virulent pathogens affecting zebrafish that should be avoided in large production facilities, such as ZIRC. Therefore, while importing a high volume of new lines we prioritize safeguarding ...

  13. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

    Contractual relations involving the use of another's property are quite common. Yet the use of another's computer resources over the Internet, and the legal transactions arising thereof, certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software, or platform as high-tech services) are highly unlikely to be described by terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gains by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct, and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing the sale of goods and services, lease, loan, and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  14. Electronic Commerce Resource Centers. An Industry--University Partnership.

    Science.gov (United States)

    Gulledge, Thomas R.; Sommer, Rainer; Tarimcilar, M. Murat

    1999-01-01

    Electronic Commerce Resource Centers focus on transferring emerging technologies to small businesses through university/industry partnerships. Successful implementation hinges on a strategic operating plan, creation of measurable value for customers, investment in customer-targeted training, and measurement of performance outputs. (SK)

  15. Building an Information Resource Center for Competitive Intelligence.

    Science.gov (United States)

    Martin, J. Sperling

    1992-01-01

    Outlines considerations in the design of a Competitive Intelligence Information Resource Center (CIIRC), which is needed by business organizations for effective strategic decision making. Discussed are user needs, user participation, information sources, technology and interface design, operational characteristics, and planning for implementation.…

  16. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  17. The NIH-NIAID Filariasis Research Reagent Resource Center.

    Directory of Open Access Journals (Sweden)

    Michelle L Michalski

    2011-11-01

    Filarial worms cause a variety of tropical diseases in humans; however, they are difficult to study because they have complex life cycles that require arthropod intermediate hosts and mammalian definitive hosts. Research efforts in industrialized countries are further complicated by the fact that some filarial nematodes that cause disease in humans are restricted in host specificity to humans alone. This potentially makes the commitment to research difficult, expensive, and restrictive. Over 40 years ago, the United States National Institutes of Health-National Institute of Allergy and Infectious Diseases (NIH-NIAID) established a resource from which investigators could obtain various filarial parasite species and life cycle stages without having to expend the effort and funds necessary to maintain the entire life cycles in their own laboratories. This centralized resource (the Filariasis Research Reagent Resource Center, or FR3) translated into cost savings to both NIH-NIAID and principal investigators by freeing up personnel costs on grants and allowing investigators to divert more funds to targeted research goals. Many investigators, especially those new to the field of tropical medicine, are unaware of the scope of materials and support provided by the FR3. This review is intended to provide a short history of the contract, brief descriptions of the filarial species and molecular resources provided, and an estimate of the impact the resource has had on the research community, and describes some new additions and potential benefits the resource center might have for the ever-changing research interests of investigators.

  18. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  19. 2014 Mid-Atlantic Telehealth Resource Center Annual Summit

    Directory of Open Access Journals (Sweden)

    Katharine Hsu Wibberly

    2013-12-01

    The Mid-Atlantic Telehealth Resource Center (MATRC; http://www.matrc.org/) advances the adoption and utilization of telehealth within the MATRC region and works collaboratively with the other federally funded Telehealth Resource Centers to accomplish the same nationally. MATRC offers technical assistance and other resources within the following mid-Atlantic states: Delaware, District of Columbia, Kentucky, Maryland, North Carolina, Pennsylvania, Virginia and West Virginia. The 2014 MATRC Summit “Adding Value through Sustainable Telehealth” will be held March 30-April 1, 2014, at the Fredericksburg Expo & Conference Center, Fredericksburg, VA. The Summit will explore how telehealth adds value to patients, practitioners, hospitals, health systems, and other facilities. Participants will experience a highly interactive program built around the case history of “Mr. Doe” as he progresses through the primary care, inpatient hospitalization, and post-discharge environments. The Summit will conclude with a session on financial and business models for providing sustainable telehealth services. For further information and registration, visit: http://matrc.org/component/content/article/2-uncategorised/80-mid-atlantic-telehealth-resource-summit-2014

  20. Amarillo National Resource Center for Plutonium 1999 plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-01-30

    The purpose of the Amarillo National Resource Center for Plutonium is to serve the Texas Panhandle, the State of Texas and the US Department of Energy by: conducting scientific and technical research; advising decision makers; and providing information on nuclear weapons materials and related environment, safety, health, and nonproliferation issues while building academic excellence in science and technology. This paper describes the electronic resource library which provides the national archives of technical, policy, historical, and educational information on plutonium. Research projects related to the following topics are described: Environmental restoration and protection; Safety and health; Waste management; Education; Training; Instrumentation development; Materials science; Plutonium processing and handling; and Storage.

  1. Amarillo National Resource Center for Plutonium 1999 plan

    International Nuclear Information System (INIS)

    1999-01-01

    The purpose of the Amarillo National Resource Center for Plutonium is to serve the Texas Panhandle, the State of Texas and the US Department of Energy by: conducting scientific and technical research; advising decision makers; and providing information on nuclear weapons materials and related environment, safety, health, and nonproliferation issues while building academic excellence in science and technology. This paper describes the electronic resource library which provides the national archives of technical, policy, historical, and educational information on plutonium. Research projects related to the following topics are described: Environmental restoration and protection; Safety and health; Waste management; Education; Training; Instrumentation development; Materials science; Plutonium processing and handling; and Storage

  2. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, as resulting from the current experience of Run2 data taking and minor changes in the LHCb computing model parameters.

  3. Some issues of creation of belarusian language computer resources

    OpenAIRE

    Rubashko, N.; Nevmerjitskaia, G.

    2003-01-01

    The main reason for creating computer resources for a natural language is the need to bring the means of language normalization into accord with the form of the language's existence: the computer form of language usage should correspond to a computer form of fixing language standards. This paper discusses various aspects of the creation of Belarusian language computer resources. It also briefly gives an overview of the objectives of the project involved.

  4. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  5. Human resource management in patient-centered pharmaceutical care.

    Science.gov (United States)

    White, S J

    1994-04-01

    Patient-centered care may have pharmacists and technicians reporting, either directly or in a matrix structure, to managers outside pharmacy administration. Pharmacy administrators will need to be both effective leaders and managers utilizing excellent human resource management skills. Significant creativity and innovation will be needed for the transition from department-based services to patient care team services. Changes in the traditional methods of recruiting, interviewing, hiring, training, developing, inspiring, evaluating, and disciplining are required in this new environment.

  6. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated into LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...
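
    The driver-based pattern behind such multi-interface integration can be sketched with Apache Libcloud as a stand-in (VMDIRAC ships its own connectors); all credentials, regions, and endpoints below are placeholders.

        # Instantiate a VM through a provider-neutral driver.
        from libcloud.compute.providers import get_driver
        from libcloud.compute.types import Provider

        def make_driver(kind):
            if kind == 'ec2':
                return get_driver(Provider.EC2)('ACCESS_KEY', 'SECRET_KEY',
                                                region='eu-west-1')
            if kind == 'openstack':
                return get_driver(Provider.OPENSTACK)(
                    'user', 'password',
                    ex_force_auth_url='https://keystone.example.org:5000',
                    ex_force_auth_version='3.x_password')
            raise ValueError(kind)

        driver = make_driver('ec2')
        size = driver.list_sizes()[0]
        image = driver.list_images()[0]
        node = driver.create_node(name='worker-001', size=size, image=image)
        print(node.id, node.state)   # monitor and manage the VM from here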

  7. Database Resources of the BIG Data Center in 2018.

    Science.gov (United States)

    2018-01-04

    The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Web-based tools from AHRQ's National Resource Center.

    Science.gov (United States)

    Cusack, Caitlin M; Shah, Sapna

    2008-11-06

    The Agency for Healthcare Research and Quality (AHRQ) has made an investment of over $216 million in research around health information technology (health IT). As part of their investment, AHRQ has developed the National Resource Center for Health IT (NRC) which includes a public domain Web site. New content for the web site, such as white papers, toolkits, lessons from the health IT portfolio and web-based tools, is developed as needs are identified. Among the tools developed by the NRC are the Compendium of Surveys and the Clinical Decision Support (CDS) Resources. The Compendium of Surveys is a searchable repository of health IT evaluation surveys made available for public use. The CDS Resources contains content which may be used to develop clinical decision support tools, such as rules, reminders and templates. This live demonstration will show the access, use, and content of both these freely available web-based tools.

  9. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  10. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2001-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  11. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  12. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  13. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  14. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  15. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included.

  16. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

    This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case of study, investigates the optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful to explore the interaction between the cloud infrastructure provider and the cloud resource c
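
    A tiny instance of such a rental planning model can be written as an integer program; the prices and demand figures below are invented, and the brief's actual models are richer (including stochastic variants).

        # Reserved vs on-demand planning as an ILP, solved with PuLP.
        import pulp

        demand = [10, 40, 25, 60]        # instances needed per period
        P_OD, P_RES = 0.12, 0.05         # hourly price: on-demand vs reserved
        RES_FIXED = 30.0                 # one-off fee per reserved instance

        m = pulp.LpProblem('rental_planning', pulp.LpMinimize)
        r = pulp.LpVariable('reserved', lowBound=0, cat='Integer')
        od = [pulp.LpVariable(f'ondemand_{t}', lowBound=0, cat='Integer')
              for t in range(len(demand))]

        m += RES_FIXED * r + pulp.lpSum(P_RES * r + P_OD * od[t]
                                        for t in range(len(demand)))
        for t, d in enumerate(demand):
            m += r + od[t] >= d          # cover demand in every period

        m.solve(pulp.PULP_CBC_CMD(msg=False))
        print('reserved =', int(r.value()),
              'on-demand =', [int(v.value()) for v in od])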

  17. Biosecurity and Health Monitoring at the Zebrafish International Resource Center.

    Science.gov (United States)

    Murray, Katrina N; Varga, Zoltán M; Kent, Michael L

    2016-07-01

    The Zebrafish International Resource Center (ZIRC) is a repository and distribution center for mutant, transgenic, and wild-type zebrafish. In recent years annual imports of new zebrafish lines to ZIRC have increased tremendously. In addition, after 15 years of research, we have identified some of the most virulent pathogens affecting zebrafish that should be avoided in large production facilities, such as ZIRC. Therefore, while importing a high volume of new lines we prioritize safeguarding the health of our in-house fish colony. Here, we describe the biosecurity and health-monitoring program implemented at ZIRC. This strategy was designed to prevent introduction of new zebrafish pathogens, minimize pathogens already present in the facility, and ensure a healthy zebrafish colony for in-house uses and shipment to customers.

  18. Education resources of the National Center for Biotechnology Information.

    Science.gov (United States)

    Cooper, Peter S; Lipshultz, Dawn; Matten, Wayne T; McGinnis, Scott D; Pechous, Steven; Romiti, Monica L; Tao, Tao; Valjavec-Gratian, Majda; Sayers, Eric W

    2010-11-01

    The National Center for Biotechnology Information (NCBI) hosts 39 literature and molecular biology databases containing almost half a billion records. As the complexity of these data and associated resources and tools continues to expand, so does the need for educational resources to help investigators, clinicians, information specialists and the general public make use of the wealth of public data available at the NCBI. This review describes the educational resources available at NCBI via the NCBI Education page (www.ncbi.nlm.nih.gov/Education/). These resources include materials designed for new users, such as About NCBI and the NCBI Guide, as well as documentation, Frequently Asked Questions (FAQs) and writings on the NCBI Bookshelf such as the NCBI Help Manual and the NCBI Handbook. NCBI also provides teaching materials such as tutorials, problem sets and educational tools such as the Amino Acid Explorer, PSSM Viewer and Ebot. NCBI also offers training programs including the Discovery Workshops, webinars and tutorials at conferences. To help users keep up-to-date, NCBI produces the online NCBI News and offers RSS feeds and mailing lists, along with a presence on Facebook, Twitter and YouTube.

  19. Improving ATLAS computing resource utilization with HammerCloud

    CERN Document Server

    Schovancova, Jaroslava; The ATLAS collaboration

    2018-01-01

    HammerCloud is a framework to commission, test, and benchmark ATLAS computing resources and components of various distributed systems with realistic full-chain experiment workflows. HammerCloud contributes to ATLAS Distributed Computing (ADC) Operations and automation efforts, providing the automated resource exclusion and recovery tools, that help re-focus operational manpower to areas which have yet to be automated, and improve utilization of available computing resources. We present recent evolution of the auto-exclusion/recovery tools: faster inclusion of new resources in testing machinery, machine learning algorithms for anomaly detection, categorized resources as master vs. slave for the purpose of blacklisting, and a tool for auto-exclusion/recovery of resources triggered by Event Service job failures that is being extended to other workflows besides the Event Service. We describe how HammerCloud helped commissioning various concepts and components of distributed systems: simplified configuration of qu...

  20. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Today cloud computing has become a key technology for the online allotment of computing resources and the online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is the matchmaking that allocates incoming tasks to suitable virtual machines. The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment that satisfies the requirements of users and balances the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time, and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and achieves higher resource utilization compared with the Active Monitor and VM-assign algorithms.
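
    The matchmaking step can be illustrated as below; the weight formula is our simplification, not the paper's exact DWAM computation.

        # Send each request to the VM with the largest weighted spare capacity.
        def pick_vm(vms, req_load):
            """vms: list of dicts with 'capacity', 'load', 'weight'."""
            best, best_score = None, float('-inf')
            for vm in vms:
                spare = vm['capacity'] - vm['load']
                if spare < req_load:
                    continue                      # cannot host this request
                score = vm['weight'] * spare / vm['capacity']
                if score > best_score:
                    best, best_score = vm, score
            if best is not None:
                best['load'] += req_load          # commit the allocation
            return best

        vms = [{'capacity': 100, 'load': 70, 'weight': 1.0},
               {'capacity': 200, 'load': 90, 'weight': 1.5}]
        print(pick_vm(vms, req_load=20))          # chooses the second VM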

  1. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes
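
    One standard building block for this kind of SLA analysis is the Erlang-C waiting probability of an M/M/c service center; including it here is our illustration, and the paper's own multi-class model is more detailed.

        # P(wait) for an M/M/c queue (Erlang C).
        from math import factorial

        def erlang_c(arrival_rate, service_rate, servers):
            a = arrival_rate / service_rate       # offered load in Erlangs
            rho = a / servers
            if rho >= 1.0:
                return 1.0                        # unstable: everyone waits
            top = a**servers / factorial(servers) / (1.0 - rho)
            bottom = sum(a**k / factorial(k) for k in range(servers)) + top
            return top / bottom

        # e.g. 40 requests/s, 1 request/s per server, 50 servers:
        print(erlang_c(40.0, 1.0, 50))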

  2. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out.

  3. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. Also, you will learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  4. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. Another major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  5. Lower Savannah aging, disability & transportation resource center : regional travel management and coordination center (TMCC) model and demonstration project.

    Science.gov (United States)

    2014-10-01

    This report details the deployed technology and implementation experiences of the Lower Savannah Aging, Disability & Transportation Resource Center in Aiken, South Carolina, which served as the regional Travel Management and Coordination Center (TM...

  6. Fluor Hanford ALARA Center is a D and D Resource

    International Nuclear Information System (INIS)

    Waggoner, L.O.

    2008-01-01

    The ALARA Center staff routinely researches and tests new technology, sponsors vendor demonstrations, and redistributes tools, equipment and temporary shielding that may not be needed at one facility to another facility that needs it. The ALARA Center staff learns about new technology in several ways, including past radiological work experience, interaction with vendors, lessons learned, networking with other DOE sites, visits to the Hanford Technical Library, and attendance at off-site conferences and ALARA Workshops. Personnel who contact the ALARA Center for assistance report positive results when they implement the tools, equipment and work practices recommended by the ALARA Center staff. This has translated into reduced exposure for workers and a reduced risk of contamination spread. For example, using a hydraulic shear on one job saved 16 rem of exposure that would have been received if workers had used Sawzall-type reciprocating saws to cut piping in twenty-nine locations. Currently, the ALARA Center staff is emphasizing D and D techniques for size-reducing materials, decontamination techniques, use of remote tools/video equipment, capture ventilation, fixatives, use of containments, and how to find lessons learned. The ALARA Center staff issues a weekly report that discusses their interaction with the workforce and any new work practices, tools and equipment being used by the Hanford contractors. This weekly report is distributed to about 130 personnel on site and 90 personnel off site, effectively spreading the word about ALARA throughout the DOE Complex. DOE EM-23, in conjunction with the D and D and Environmental Restoration work group of the Energy Facility Contractors Organization (EFCOG), established the Hanford ALARA Center as the D and D Hotline for companies who have questions about how D and D work is accomplished. The ALARA Center has become a resource to the nuclear industry and routinely helps contractors at other DOE Sites, power reactors, DOD sites, and

  7. Decentralized Resource Management in Distributed Computer Systems.

    Science.gov (United States)

    1982-02-01

    directly exchanging user state information. Eventcounts and sequencers correspond to semaphores in the sense that synchronization primitives are used to...and techniques are required to achieve synchronization in distributed computers without reliance on any centralized entity such as a semaphore ...known solutions to the access synchronization problem was Dijkstra’s semaphore [12]. The importance of the semaphore is that it correctly addresses the

  8. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics
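
    The scaling argument can be restated compactly in our own notation (a paraphrase, not the authors' formula): for $n$ qubits,

        \dim \mathcal{H}_n = 2^n ,

    so a computer encoding $\mathcal{H}_n$ in a single particle, atom, or molecule must supply $\Theta(2^n)$ distinguishable physical states, whereas $n$ coupled two-level systems reach the same dimension with only $\Theta(n)$ degrees of freedom; scalability requires the latter, nearly linear, growth.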

  9. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  10. Japan's silver human resource centers and participant well-being.

    Science.gov (United States)

    Weiss, Robert S; Bass, Scott A; Heimovitz, Harley K; Oka, Masato

    2005-03-01

    Japan's Silver Human Resource Center (SHRC) program provides part-time, paid employment to retirement-aged men and women. We studied 393 new program participants and examined whether part-time work influenced their well-being or "ikigai." The participants were divided into those who had worked in SHRC-provided jobs in the preceding year, and those who had not. Gender-stratified regression models were fitted to determine whether SHRC employment was associated with increased well-being. For men, actively working at a SHRC job was associated with greater well-being, compared to inactive members. And men with SHRC jobs and previous volunteering experience had the greatest increase in well-being. Women SHRC job holders did not experience increased well-being at the year's end. The study concludes that there is justification for exploring the usefulness of a similar program for American retirees who desire post-retirement part-time work.

  11. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  12. Development of a center for biosystematics resources. Progress report, November 1, 1978-October 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, S.R.

    1979-11-01

    The objective in the development of a Center for Biosystematics Resources is to provide a centralized source of information regarding the biological expertise available in the academic/museum community; and the federal and state regulations concerning the acquisition, transport, and possession of biological specimens. Such a Center would serve to facilitate access to this widely dispersed information. The heart of the Center is a series of computer assisted data bases which contain information on biologists and their areas of expertise, biological collections, annotated federal regulations, and federal and state controlled species lists. The purpose of this three-year contract with the Department of Energy is to continue the updating and revision of these data bases, make the information they contain readily available to the Department of Energy, other government agencies, the private sector, and the academic community; and to achieve financial independence by the end of the three-year period.

  13. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  14. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

    Cloud computing has become a focus of educational and business communities. Their concerns include the need to improve the Quality of Service (QoS) provided, as well as qualities such as reliability and performance, and to reduce costs. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be the major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  15. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a subsequent upsurge in data center carbon emissions, cost incurred, unethical waste management, depletion of natural resources and high energy utilization. This raises the issue of attaining sustainability in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT based industries utilize data centers to provide services to staff, practitioners and end users. But it is a known fact that enterprise servers utilize huge quantities of energy and incur other expenditures in cooling operations, and it is difficult to address the needs of accuracy and efficiency in data centers while encouraging greener application practices alongside cost reduction. Thus this research study focuses on the practical application of Green computing in data centers which house servers, and presents the Green computing life cycle strategies and best practices to be followed for better management of data centers in IT based industries. Data was collected through a questionnaire from 133 respondents in industries that currently operate their in-house data centers. The analysed data was used to verify the Green computing life cycle strategies presented in this study. Findings from the data show that each of the life cycle strategies is significant in assisting IT based industries to apply Green computing practices in their data centers. This study would be of interest to knowledge and data management practitioners as well as environmental managers and academicians deploying Green data centers in their organizations.

  16. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  17. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.
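
    The on-the-fly change of allocated resources and price described above can be pictured as a simple control loop; the following sketch is an illustration under assumed thresholds and prices, not the module evaluated in the paper.

        SLA_RESPONSE_MS = 200        # quality-of-service target from the SLA (assumed)
        PRICE_PER_VCPU_HOUR = 0.05   # assumed unit price

        def rescale(measured_response_ms: float, vcpus: int):
            """Grow when the SLA is violated, shrink when over-provisioned."""
            if measured_response_ms > SLA_RESPONSE_MS:
                vcpus += 1                      # scale up to restore QoS
            elif measured_response_ms < 0.5 * SLA_RESPONSE_MS and vcpus > 1:
                vcpus -= 1                      # scale down to save cost
            return vcpus, vcpus * PRICE_PER_VCPU_HOUR   # allocation and new hourly price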

  18. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh...Intermediary Resource: Intelligent Executive Computer Communication, John Lyman and Carla J. Conaway, University of California at Los Angeles, for Contracting...Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent

  19. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  20. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) - it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  1. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of the Internet of Things (IoT) system has been increasingly demanding more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation by using traditional cloud computing models. Fog computing enables storage and computing services to perform at the edge of the network to extend cloud computing. However, there are some problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism of typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.
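
    To make the cost trade-off concrete, here is a toy placement rule that minimizes a combined network/computation/storage cost across three tiers; the per-tier rates are invented for illustration and are not from the paper.

        TIERS = {
            #        (compute $/unit, storage $/GB, network $/GB moved) -- assumed rates
            "edge":  (0.9, 0.10, 0.01),
            "fog":   (0.5, 0.05, 0.05),
            "cloud": (0.2, 0.02, 0.20),
        }

        def cheapest_tier(compute_units: float, storage_gb: float, transfer_gb: float) -> str:
            """Pick the tier minimizing total processing cost over network,
            computation and storage."""
            def cost(tier):
                c, s, n = TIERS[tier]
                return compute_units * c + storage_gb * s + transfer_gb * n
            return min(TIERS, key=cost)

        print(cheapest_tier(compute_units=10, storage_gb=50, transfer_gb=2))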

  2. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
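
    The base-line-versus-burst conclusion follows from simple arithmetic; the sketch below uses purely hypothetical prices and capacities (none of the paper's actual figures) to show the shape of the comparison.

        ec2_per_core_hour = 0.10         # assumed on-demand rate
        dedicated_capex = 250_000        # assumed cluster purchase price
        dedicated_lifetime_h = 4 * 365 * 24

        def cheaper_option(avg_utilized_cores: float) -> str:
            ec2_cost = avg_utilized_cores * dedicated_lifetime_h * ec2_per_core_hour
            return "dedicated" if ec2_cost > dedicated_capex else "EC2"

        # Steady base-line load favors owned hardware; rare bursts favor the cloud.
        print(cheaper_option(avg_utilized_cores=500))   # -> dedicated
        print(cheaper_option(avg_utilized_cores=5))     # -> EC2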

  3. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
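
    As a flavor of what provisioning a worker VM on such a private OpenStack cloud looks like, here is a minimal openstacksdk sketch; the cloud name, image, flavor and network are placeholders, not the EKP configuration.

        import openstack

        conn = openstack.connect(cloud="ekp-private")   # entry in clouds.yaml (assumed name)

        server = conn.compute.create_server(
            name="worker-001",
            image_id=conn.compute.find_image("hep-worker-image").id,   # assumed image
            flavor_id=conn.compute.find_flavor("m1.large").id,
            networks=[{"uuid": conn.network.find_network("lan").id}],  # assumed network
        )
        server = conn.compute.wait_for_server(server)   # block until the VM is ACTIVE
        print(server.status)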

  4. Electricity production perspective regarding resource recovery center (RRC) in Malaysia

    International Nuclear Information System (INIS)

    Masoud Aghajani Mir; Noor Ezlin Ahmad Basri; Rawshan Ara Begum; Sanaz Saheri

    2010-01-01

    Waste disposal is a global problem that contributes to ongoing climate change because of large emissions of greenhouse gases. By using waste material as a resource instead of landfilling it, the greenhouse gas emissions from landfills are reduced. Waste material can also be used for incineration with energy recovery, decreasing the greenhouse gas emissions from energy utilization by changing from fossil fuels to a partly renewable fuel. The production of Refuse Derived Fuel (RDF) involves the mechanical processing of household waste using screens, shredders and separators to recover recyclable materials and to produce a combustible product for the Resource Recovery Center/Waste to Energy (RRC/WtE) Facility in Malaysia, which is located in Semenyih. This system involves the removal of inert and compostable materials followed by pulverization to produce a feedstock which can be incinerated in power stations. The purpose of this study is to evaluate and forecast the number of these facilities that Kuala Lumpur will need, given the potential Municipal Solid Waste (MSW) generation and the Refuse Derived Fuel that can be produced from it in the future. The plant is able to produce on average 7.5 MWh of electricity per day from 700 tons of MSW or 200 tons of RDF; approximately 1.8 MWh per day is used inside the plant, so it can sell around 5.7 MWh daily. Kuala Lumpur will generate around 7713 tons of MSW per day, from which around 2466 tons of RDF per day can be produced. Given the potential MSW and RDF generation by 2020, Kuala Lumpur will need around 11 plants to treat its MSW, and this number of plants is able to produce around 62.8 MWh of electricity per day. (author)
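
    The plant count and the total output quoted above are consistent with each other:

        N_{\text{plants}} \approx \frac{7713~\text{t MSW/day}}{700~\text{t MSW/plant/day}} \approx 11,
        \qquad
        E_{\text{sold}} \approx 11 \times 5.7~\text{MWh/day} \approx 62.7~\text{MWh/day},

    which matches the reported figure of about 62.8 MWh per day.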

  5. Development of user-centered interfaces to search the knowledge resources of the Virginia Henderson International Nursing Library.

    Science.gov (United States)

    Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane

    2003-01-01

    This poster describes the development of user-centered interfaces in order to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from a library to a web-based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.

  6. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  7. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  8. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  9. Silicon Photonics towards Disaggregation of Resources in Data Centers

    Directory of Open Access Journals (Sweden)

    Miltiadis Moralis-Pegios

    2018-01-01

    Full Text Available In this paper, we demonstrate two subsystems based on Silicon Photonics, towards meeting the network requirements imposed by disaggregation of resources in Data Centers. The first one utilizes a 4 × 4 Silicon photonics switching matrix, employing Mach Zehnder Interferometers (MZIs) with Electro-Optical phase shifters, directly controlled by a high speed Field Programmable Gate Array (FPGA) board for the successful implementation of a Bloom-Filter (BF) label forwarding scheme. The FPGA is responsible for extracting the BF-label from the incoming optical packets, carrying out the BF-based forwarding function, determining the appropriate switching state and generating the corresponding control signals towards conveying incoming packets to the desired output port of the matrix. The BF-label based packet forwarding scheme allows rapid reconfiguration of the optical switch, while at the same time reduces the memory requirements of the node’s lookup table. Successful operation for 10 Gb/s data packets is reported for a 1 × 4 routing layout. The second subsystem utilizes three integrated spiral waveguides, with a record-high delay-versus-footprint efficiency of 2.6 ns/mm², along with two Semiconductor Optical Amplifier Mach-Zehnder Interferometer (SOA-MZI) wavelength converters, to construct a variable optical buffer and a Time Slot Interchange module. Error-free on-chip variable delay buffering from 6.5 ns up to 17.2 ns and successful timeslot interchanging for 10 Gb/s optical packets are presented.
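
    The Bloom-filter label forwarding idea is easy to state in code: every output link hashes to a few bit positions, a packet's label is the OR of the bits of the links on its path, and a node forwards on each local link whose bits are all set. The sketch below is an illustration (hash choices, label width and link names are assumptions), not the FPGA implementation.

        import hashlib

        M = 64  # BF label width in bits (assumed)

        def bf_bits(link_id: str):
            """k = 3 hash positions for one link ID."""
            return {int(hashlib.sha256(f"{link_id}/{i}".encode()).hexdigest(), 16) % M
                    for i in range(3)}

        def make_label(path_links):
            label = 0
            for link in path_links:
                for b in bf_bits(link):
                    label |= 1 << b
            return label

        def forward(label, local_links):
            """Forward on every local link whose bits are all set in the label;
            no per-node lookup table is needed. Bloom filters may produce
            occasional false-positive forwards."""
            return [l for l in local_links
                    if all(label >> b & 1 for b in bf_bits(l))]

        label = make_label(["A->B", "B->out3"])
        print(forward(label, ["B->out1", "B->out2", "B->out3"]))   # -> ['B->out3']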

  10. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  11. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC the required wrapper classes were develo...

  12. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient software. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics.

  13. Integration of Openstack cloud resources in BES III computing cluster

    Science.gov (United States)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for the data processing of high energy physics experiments. However, in a traditional job management system the resources of each queue are fixed and their usage is static. In order to make the system simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.
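
    The dynamic scheduling "according to the job queue" can be reduced to a reconciliation rule like the one below; the jobs-per-VM packing and the VM start/stop hooks are assumptions standing in for site-specific Torque/HTCondor tooling.

        def reconcile(queued_jobs: int, running_vms: int,
                      jobs_per_vm: int = 8, max_vms: int = 100) -> int:
            """Return how many VMs to start (+) or stop (-) this cycle."""
            wanted = min(max_vms, -(-queued_jobs // jobs_per_vm))  # ceil division
            return wanted - running_vms

        # One scheduling cycle: 130 queued jobs and 10 running VMs -> start 7 more.
        print(reconcile(queued_jobs=130, running_vms=10))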

  14. Computer-aided resource planning and scheduling for radiological services

    Science.gov (United States)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study was conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information-system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  15. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  16. L-025: EPR-First Responders: Resource Coordinator and National Center for Emergency Operations

    International Nuclear Information System (INIS)

    2011-01-01

    This conference covers the importance of the resource coordinator and the National Center for Emergency Operations, which provides a stable installation environment and valuable aid in a radiological emergency situation. The resource coordinator maintains the registers of resources and their general locations, while the National Center for Emergency Operations is the ideal place for the public information center. Both roles provide support and encourage the efforts of the Incident Command to respond to the incident.

  17. A Semi-Preemptive Computational Service System with Limited Resources and Dynamic Resource Ranking

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2012-03-01

    Full Text Available In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests based on their time constraints are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find suitable resources for the requests of higher type, it preempts the resource that is now executing a lower type request so that the request of higher type can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), users of which can then exploit the convenient mobile network services and the wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes.
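
    A rough sketch of the scoring and semi-preemptive matching described above; the weights, level cut-offs and data layout are invented for illustration and certainly differ from the SPA's actual parameters.

        def score(cpu_level, free_mem_gb, queue_len, cpu_util, bandwidth_mbps):
            """Dynamic node score from the five factors named in the abstract."""
            return (2.0 * cpu_level + 0.5 * free_mem_gb - 1.0 * queue_len
                    - 3.0 * cpu_util + 0.01 * bandwidth_mbps)

        def level(s):
            """Classify a resource node into one of three levels."""
            return 1 if s >= 10 else (2 if s >= 5 else 3)

        def allocate(request_type, nodes):
            """Give tightly constrained requests (type 1) the best nodes,
            preempting a lower-type job when nothing suitable is free."""
            ranked = sorted(nodes, key=lambda n: n["score"], reverse=True)
            for n in ranked:
                if n["free"] and level(n["score"]) <= request_type:
                    return n
            for n in ranked:                      # the semi-preemptive step
                if not n["free"] and n["running_type"] > request_type:
                    n["preempted"] = True
                    return n
            return None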

  18. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resources-based framework arises which, after its first cases of use, seems to be useful and worthy of further research.

  19. National Center for Mathematics and Science - teacher resources

    Science.gov (United States)

    National Center for Improving Student Learning and Achievement in Mathematics and Science (NCISLA) teacher resources: instructional resources that support and improve student understanding of mathematics and science, including Powerful Practices in Mathematics and Science (CD), a multimedia product for educators, professional

  20. Use of computers and Internet among people with severe mental illnesses at peer support centers.

    Science.gov (United States)

    Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas

    2017-12-01

    Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than 3 quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Amarillo National Resource Center for Plutonium. Quarterly technical progress report, May 1, 1997--July 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    Progress summaries are provided from the Amarillo National Center for Plutonium. Programs include the plutonium information resource center, environment, public health, and safety, education and training, nuclear and other material studies.

  2. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  3. Can the Teachers' Creativity Overcome Limited Computer Resources?

    Science.gov (United States)

    Nikolov, Rumen; Sendova, Evgenia

    1988-01-01

    Describes experiences of the Research Group on Education (RGE) at the Bulgarian Academy of Sciences and the Ministry of Education in using limited computer resources when teaching informatics. Topics discussed include group projects; the use of Logo; ability grouping; and out-of-class activities, including publishing a pupils' magazine. (13…

  4. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  5. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  6. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  7. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  8. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for the experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  9. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm, the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N·maxflow(N)), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x and maxflow(N) is the measure of complexity (and thus of cost) of a maximum-flow
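
    What the envelope means can be checked by brute force on a tiny flexible plan: enumerate every fixed-time schedule and take the pointwise max/min resource level. The paper's algorithm reaches the same bound via shortest paths plus maximum flow; the enumeration below is only a definitional illustration.

        from itertools import product

        # Each activity: (earliest start, latest start, duration, resource delta).
        activities = [(0, 2, 3, +2),   # producer with a flexible start time
                      (1, 4, 2, -1)]   # consumer with a flexible start time
        horizon = 8

        env_max = [float("-inf")] * horizon
        env_min = [float("inf")] * horizon
        for starts in product(*[range(es, ls + 1) for es, ls, _, _ in activities]):
            level = [0] * horizon
            for (es, ls, dur, delta), s in zip(activities, starts):
                for t in range(s, min(s + dur, horizon)):
                    level[t] += delta      # the activity holds its delta while active
            for t in range(horizon):
                env_max[t] = max(env_max[t], level[t])
                env_min[t] = min(env_min[t], level[t])

        print(env_max)   # tightest upper bound; attained by some fixed-time schedule
        print(env_min)   # tightest lower bound; attained by some fixed-time schedule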

  10. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate

  11. Energy-efficient cloud computing : autonomic resource provisioning for datacenters

    OpenAIRE

    Tesfatsion, Selome Kostentinos

    2018-01-01

    Energy efficiency has become an increasingly important concern in data centers because of issues associated with energy consumption, such as capital costs, operating expenses, and environmental impact. While energy loss due to suboptimal use of facilities and non-IT equipment has largely been reduced through the use of best-practice technologies, addressing energy wastage in IT equipment still requires the design and implementation of energy-aware resource management systems. This thesis focu...

  12. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  13. Science center capabilities to monitor and investigate Michigan’s water resources, 2016

    Science.gov (United States)

    Giesen, Julia A.; Givens, Carrie E.

    2016-09-06

    Michigan faces many challenges related to water resources, including flooding, drought, water-quality degradation and impairment, varying water availability, watershed-management issues, stormwater management, aquatic-ecosystem impairment, and invasive species. Michigan’s water resources include approximately 36,000 miles of streams, over 11,000 inland lakes, 3,000 miles of shoreline along the Great Lakes (MDEQ, 2016), and groundwater aquifers throughout the State.The U.S. Geological Survey (USGS) works in cooperation with local, State, and other Federal agencies, as well as tribes and universities, to provide scientific information used to manage the water resources of Michigan. To effectively assess water resources, the USGS uses standardized methods to operate streamgages, water-quality stations, and groundwater stations. The USGS also monitors water quality in lakes and reservoirs, makes periodic measurements along rivers and streams, and maintains all monitoring data in a national, quality-assured, hydrologic database.The USGS in Michigan investigates the occurrence, distribution, quantity, movement, and chemical and biological quality of surface water and groundwater statewide. Water-resource monitoring and scientific investigations are conducted statewide by USGS hydrologists, hydrologic technicians, biologists, and microbiologists who have expertise in data collection as well as various scientific specialties. A support staff consisting of computer-operations and administrative personnel provides the USGS the functionality to move science forward. Funding for USGS activities in Michigan comes from local and State agencies, other Federal agencies, direct Federal appropriations, and through the USGS Cooperative Matching Funds, which allows the USGS to partially match funding provided by local and State partners.This fact sheet provides an overview of the USGS current (2016) capabilities to monitor and study Michigan’s vast water resources. More

  14. 34 CFR 669.1 - What is the Language Resource Centers Program?

    Science.gov (United States)

    2010-07-01

    § 669.1 (Title 34 Education, Postsecondary Education, Department of Education, Language Resource Centers Program, General): the Language Resource Centers Program supports improving the nation's capacity for teaching and learning foreign languages effectively. (Authority: 20 U.S...)

  15. National Resource Center for Health and Safety in Child Care and Early Education

    Science.gov (United States)

    The National Resource Center for Health and Safety in Child Care and Early Education (NRC) at the University of Colorado College of ... Email: info@NRCKids.org

  16. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  17. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.
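
    The adaptive-allocation idea reduces to a monitor-and-migrate loop; this toy sketch (hosts, metrics and the one-task-per-cycle policy are assumptions, not DeSiDeRaTa's mechanisms) shows the shape of such middleware.

        def rebalance(hosts: dict, deadline_ms: float):
            """hosts: {name: {"latency_ms": float, "tasks": int}}.
            Move one task per cycle from each overloaded host to the
            least-loaded comfortably fast host."""
            slow = [h for h, m in hosts.items() if m["latency_ms"] > deadline_ms]
            fast = [h for h, m in hosts.items() if m["latency_ms"] <= 0.5 * deadline_ms]
            moves = []
            for s in slow:
                if fast and hosts[s]["tasks"] > 1:
                    d = min(fast, key=lambda h: hosts[h]["latency_ms"])
                    hosts[s]["tasks"] -= 1
                    hosts[d]["tasks"] += 1
                    moves.append((s, d))
            return moves

        hosts = {"payload": {"latency_ms": 80, "tasks": 4},
                 "ground":  {"latency_ms": 15, "tasks": 1}}
        print(rebalance(hosts, deadline_ms=50))   # -> [('payload', 'ground')]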

  18. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  19. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, approaching subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  20. Secure data exchange between intelligent devices and computing centers

    Science.gov (United States)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has ostensibly raised the stakes for the conception of computing-intensive environments using intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. These devices are employed in highly volatile environments where the secure exchange of data between these devices and their computing centers is of paramount importance. Moreover, their mission-critical applications require dependable measures against attacks such as denial of service (DoS), eavesdropping, and masquerading. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called a 'computational grid'. The notion of the infosphere is used to define a digital space made up of persistent and volatile assets in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will face a proliferation of users, applications, networked devices, and their interactions on a scale never experienced before, so it is better to build in the ability to deal with these systems uniformly. As a solution, we propose the concept of virtualization of security services. We try to solve the difficult problems of the implementation and maintenance of trust on the one hand, and those of security management in heterogeneous infrastructure on the other.

  1. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    By leveraging cloud computing such as Infrastructure as a Service (IaaS), the outsourcing of computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via Service Level Agreements (SLAs) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS are also concerned with their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we regard this resource allocation problem as a Multicriteria Decision Making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for streaming applications is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a balance between these conflicting criteria commendably and achieves high cost efficiency.
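
    As a toy illustration of the goal-programming idea described above, the sketch below trades a cost goal off against a QoS goal through weighted deviation variables; the coefficients, goal values, and weights are invented for illustration, and scipy's linprog serves as the solver.

        # Toy goal program: choose VM counts x1, x2 to approach a cost goal and a QoS goal.
        # Unwanted deviations (cost overrun d1p, QoS shortfall d2m) are penalized.
        from scipy.optimize import linprog

        w_cost, w_qos = 1.0, 2.0          # illustrative priority weights
        # variable order: [x1, x2, d1m, d1p, d2m, d2p]
        c = [0, 0, 0, w_cost, w_qos, 0]   # minimize the weighted undesirable deviations
        A_eq = [
            [3, 5, 1, -1, 0, 0],          # 3*x1 + 5*x2 + d1m - d1p = 40  (cost goal)
            [2, 4, 0, 0, 1, -1],          # 2*x1 + 4*x2 + d2m - d2p = 30  (QoS goal)
        ]
        b_eq = [40, 30]
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
        x1, x2 = res.x[:2]
        print(f"allocate x1={x1:.1f}, x2={x2:.1f} VMs; penalty={res.fun:.2f}")

    Equality constraints with paired deviation variables are the standard goal-programming device: the solver is free to miss a goal, but only at the weighted price set in the objective, which is how the balance between conflicting criteria is struck.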

  2. National Maternal and Child Oral Health Resource Center

    Science.gov (United States)

    Center for Oral Health Systems Integration and Improvement (COHSII): COHSII is a ... needs of the MCH population. Brush Up on Oral Health: This monthly newsletter provides Head Start staff with ...

  3. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has been recently installed. The system will be of great help to the exciting physics analyses for the coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is the DELL PowerEdge 1955, which contains two dual-core Intel Xeon (Woodcrest) CPUs running at 3 GHz; a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured as RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with a redundant configuration in a cost-effective way. The next major upgrade will take place in thre...

  4. Community Coordinated Modeling Center: A Powerful Resource in Space Science and Space Weather Education

    Science.gov (United States)

    Chulaki, A.; Kuznetsova, M. M.; Rastaetter, L.; MacNeice, P. J.; Shim, J. S.; Pulkkinen, A. A.; Taktakishvili, A.; Mays, M. L.; Mendoza, A. M. M.; Zheng, Y.; Mullinix, R.; Collado-Vega, Y. M.; Maddox, M. M.; Pembroke, A. D.; Wiegand, C.

    2015-12-01

    Community Coordinated Modeling Center (CCMC) is a NASA-affiliated interagency partnership with the primary goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research. Additionally, over the past ten years it has established itself as a global space science education resource supporting undergraduate and graduate education and research, and spreading space weather awareness worldwide. A unique combination of assets, capabilities and close ties to the scientific and educational communities enable this small group to serve as a hub for raising generations of young space scientists and engineers. CCMC resources are publicly available online, providing unprecedented global access to the largest collection of modern space science models (developed by the international research community). CCMC has revolutionized the way simulations are utilized in classroom settings, student projects, and scientific labs, and serves hundreds of educators, students and researchers every year. Another major CCMC asset is an expert space weather prototyping team primarily serving NASA's interplanetary space weather needs. Capitalizing on its unrivaled capabilities and experiences, the team provides in-depth space weather training to students and professionals worldwide, and offers an amazing opportunity for undergraduates to engage in real-time space weather monitoring, analysis, forecasting and research. In-house development of state-of-the-art space weather tools and applications provides exciting opportunities to students majoring in computer science and computer engineering fields to intern with the software engineers at the CCMC while also learning about space weather from NASA scientists.

  5. 77 FR 72868 - The Centers for Disease Control (CDC)/Health Resources and Services Administration (HRSA...

    Science.gov (United States)

    2012-12-06

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention The Centers for Disease Control (CDC)/Health Resources and Services Administration (HRSA) Advisory Committee on HIV, Viral... announcements of meetings and other committee management activities, for both the Centers for Disease Control...

  6. Development of a center for biosystematics resources. Summary report, November 1, 1979-October 31, 1980

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, S.R.

    1980-11-01

    The objective in the development of a Center for Biosystematics Resources is to provide a centralized source of information regarding the biological expertise available in the academic/museum community; and the federal and state regulations concerning the acquisition, transport, and possession of biological specimens. Such a Center would serve to facilitate access to this widely dispersed information. The heart of the Center is a series of computer assisted data bases which contain information on biologists and their areas of expertise, biological collections, annotated federal regulations, and federal and state controlled species lists. In the last year these data bases have been updated and expanded. Additional data bases have been constructed and are being maintained. The purpose of this three-year contract with the Department of Energy is to continue the updating and revision of the original data bases, make the information they contain readily available to the Department of Energy, other government agencies, the private sector, and the academic community; and to achieve financial independence by the end of the three-year period.

  7. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  8. Animation company "Fast Forwards" production with HP Utility Data Center; film built using Adaptive Enterprise framework enabled by shared, virtual resource

    CERN Multimedia

    2003-01-01

    Hewlett Packard have produced a commercial-quality animated film using an experimental rendering service from HP Labs and running on an HP Utility Data Center (UDC). The project demonstrates how computing resources can be managed virtually and illustrates the value of utility computing, in which an end-user taps into a large pool of virtual resources, but pays only for what is used (1 page).

  9. Alternative Fuels Data Center: Codes and Standards Resources

    Science.gov (United States)

    The resources linked below help project developers and code officials prepare and review code-compliant projects for alternative fuel vehicles, storage, and infrastructure. The following charts show the SDOs responsible for these alternative fuel codes and standards: Biodiesel Vehicle and Infrastructure Codes and Standards Chart; Electric Vehicle and

  10. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  11. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  12. Plant Resources Center and the Vietnamese genebank system

    Science.gov (United States)

    The highly diverse floristic composition of Vietnam has been recognized as a center of angiosperm expansion and crop biodiversity. The broad range of climatic environments include habitats from tropical and subtropical, to temperate and alpine flora. The human component of the country includes 54 et...

  13. Moving from "optimal resources" to "optimal care" at trauma centers.

    Science.gov (United States)

    Shafi, Shahid; Rayan, Nadine; Barnes, Sunni; Fleming, Neil; Gentilello, Larry M; Ballard, David

    2012-04-01

    The Trauma Quality Improvement Program has shown that risk-adjusted mortality rates at some centers are nearly 50% higher than at others. This "quality gap" may be due to different clinical practices or processes of care. We have previously shown that adoption of processes called core measures by the Joint Commission and Centers for Medicare and Medicaid Services does not improve outcomes of trauma patients. We hypothesized that improved compliance with trauma-specific clinical processes of care (POC) is associated with reduced in-hospital mortality. Records of a random sample of 1,000 patients admitted to a Level I trauma center who met Trauma Quality Improvement Program criteria (age ≥ 16 years and Abbreviated Injury Scale score ≥ 3) were retrospectively reviewed for compliance with 25 trauma-specific POC (T-POC) that were evidence-based or expert consensus panel recommendations. Multivariate regression was used to determine the relationship between T-POC compliance and in-hospital mortality, adjusted for age, gender, injury type, and severity. Median age was 41 years, 65% were men, 88% sustained a blunt injury, and mortality was 12%. Of these, 77% were eligible for at least one T-POC and 58% were eligible for two or more. There was wide variation in T-POC compliance. Every 10% increase in compliance was associated with a 14% reduction in risk-adjusted in-hospital mortality. Unlike adoption of core measures, compliance with T-POC is associated with reduced mortality in trauma patients. Trauma centers with excess in-hospital mortality may improve patient outcomes by consistently applying T-POC. These processes should be explored for potential use as Core Trauma Center Performance Measures.

  14. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
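
    The diminishing-returns trade-off the report describes is easy to reproduce with a minimal PNN-style (Gaussian-kernel) classifier; the synthetic data, kernel width, and sample sizes below are invented for illustration and are not the report's experiment.

        # Minimal Gaussian-kernel (PNN-style) classifier: calibration cost vs. benefit.
        import time
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(4000, 2))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)    # synthetic two-class pattern
        X_test, y_test = X[3000:], y[3000:]        # held-out evaluation set

        def pnn_predict(X_train, y_train, X_eval, sigma=0.5):
            preds = []
            for x in X_eval:
                d2 = ((X_train - x) ** 2).sum(axis=1)
                k = np.exp(-d2 / (2 * sigma ** 2))                  # kernel activations
                scores = [k[y_train == c].mean() for c in (0, 1)]   # mean per class
                preds.append(int(np.argmax(scores)))
            return np.asarray(preds)

        for n in (50, 200, 800, 3000):             # growing calibration samples
            t0 = time.perf_counter()
            acc = (pnn_predict(X[:n], y[:n], X_test) == y_test).mean()
            print(f"n={n:4d}  cost={time.perf_counter() - t0:.2f}s  fit={acc:.3f}")

    Accuracy typically saturates well before the largest sample while the cost keeps growing, which is exactly the point of diminishing returns the report suggests targeting with a stratified sampling strategy.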

  15. Fiscal Year 1988 program report: Rhode Island Water Resources Center

    International Nuclear Information System (INIS)

    Poon, C.P.C.

    1989-07-01

    The State of Rhode Island is active in water resources planning, development, and management activities which include legislation, upgrading of wastewater treatment facilities, upgrading and implementing pretreatment programs, and protecting watersheds and aquifers throughout the state. Current and anticipated state water problems are contamination and cleanup of aquifers to protect the valuable groundwater resources; protection of watersheds by controlling non-point source pollution; development of pretreatment technologies; and deteriorating groundwater quality from landfill leachate or drainage from septic tank leaching fields. Seven projects were included covering the following subjects: (1) Radon and its parent nuclei in bedrock; (2) Model for natural flushing of aquifer; (3) Microbial treatment of heavy metals; (4) Vegetative uptake of nitrate; (5) Microbial process in vegetative buffer strips; (6) Leachate characterization in landfills; and (7) Electrochemical treatment of heavy metals and cyanide

  16. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  17. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One solution is to use cloud computing. However, this raises an optimization problem of allocating mobile device and cloud resources. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under a constrained execution time.
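
    A minimal version of such an allocation heuristic is sketched below: tasks are offloaded to the cloud greedily, in order of the energy they save, subject to an execution-time budget. The task numbers and the greedy rule are illustrative assumptions, not the paper's algorithm.

        # Greedy offloading sketch: reduce device energy under a total-time budget.
        tasks = [  # (name, local_time_s, local_energy, cloud_time_s, transfer_energy)
            ("move-gen",   0.8, 4.0, 0.3, 0.5),
            ("evaluation", 1.2, 6.0, 0.4, 0.7),
            ("ui-render",  0.2, 0.5, 0.6, 0.6),   # cheaper to keep local
        ]
        BUDGET = 1.5  # seconds allowed for one interaction step

        def plan(tasks, budget):
            placement = {name: "local" for name, *_ in tasks}
            total_time = sum(t[1] for t in tasks)
            # try the biggest energy savers first
            for name, lt, le, ct, te in sorted(tasks, key=lambda t: t[2] - t[4], reverse=True):
                new_time = total_time - lt + ct
                if le - te > 0 and new_time <= budget:
                    placement[name] = "cloud"
                    total_time = new_time
            # note: if no offloading can meet the budget, the plan may still exceed it
            return placement, total_time

        placement, t = plan(tasks, BUDGET)
        print(placement, f"time={t:.1f}s")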

  18. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized for end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  20. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)
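
    For context, the quasi-probability representation in question is the discrete Wigner function for odd Hilbert-space dimension d. A standard form, stated here from the general literature on discrete Wigner functions rather than from this record, is

        W_\rho(u) \;=\; \frac{1}{d}\,\operatorname{Tr}\!\left[ A_u\,\rho \right],
        \qquad u \in \mathbb{Z}_d \times \mathbb{Z}_d,
        \qquad \sum_{u} W_\rho(u) = 1,

    where the A_u are Hermitian phase-point operators. Stabilizer states satisfy W_\rho(u) \ge 0 at every phase-space point, so the result above can be read as: negativity somewhere in W_\rho is a necessary resource for quantum speed-up.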

  1. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the Langley Science Directorate needs to be evaluated by integrating it with real-world operational needs across NASA, with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.

  2. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  3. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  5. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  6. Criteria and foundations for the implementation of the Learning Resource Centers

    OpenAIRE

    Raquel Zamora Fonseca

    2013-01-01

    This article reviews the criteria and rationale for the implementation of research-library and learning resource centers. The analysis focuses on the implementation of CRAIs in university libraries and the organizational models they can take.

  8. Amarillo National Resource Center for Plutonium quarterly technical progress report, August 1--October 31, 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-11-01

    This paper describes activities of the Center under the following topical sections: Electronic resource library; Environmental restoration and protection; Health and safety; Waste management; Communication program; Education program; Training; Analytical development; Materials science; Plutonium processing and handling; and Storage.

  9. Amarillo National Resource Center for Plutonium. Quarterly technical progress report, February 1, 1998--April 30, 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    Activities from the Amarillo National Resource Center for Plutonium are described. Areas of work include materials science of nuclear and explosive materials, plutonium processing and handling, robotics, and storage.

  10. Use of IKONOS Data for Mapping Cultural Resources of Stennis Space Center, Mississippi

    Science.gov (United States)

    Spruce, Joseph P.; Giardino, Marco

    2002-01-01

    Cultural resource surveys are important for compliance with Federal and State law. Stennis Space Center (SSC) in Mississippi is researching, developing, and validating remote sensing and Geographic Information System (GIS) methods for aiding cultural resource assessments on the center's own land. The suitability of IKONOS satellite imagery for georeferencing scanned historic maps is examined in this viewgraph presentation. IKONOS data can be used to map historic buildings and farmland in Gainesville, MS, and to plan archaeological surveys.

  11. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  12. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  13. New research resources at the Bloomington Drosophila Stock Center.

    Science.gov (United States)

    Cook, Kevin R; Parks, Annette L; Jacobus, Luke M; Kaufman, Thomas C; Matthews, Kathleen A

    2010-01-01

    The Bloomington Drosophila Stock Center (BDSC) is a primary source of Drosophila stocks for researchers all over the world. It houses over 27,000 unique fly lines and distributed over 160,000 samples of these stocks this past year. This report provides a brief overview of significant recent events at the BDSC with a focus on new stock sets acquired in the past year, including stocks for phiC31 transformation, RNAi knockdown of gene expression, and SNP and quantitative trait loci discovery. We also describe additions to sets of insertions and molecularly defined chromosomal deficiencies, the creation of a new Deficiency Kit, and planned additions of X chromosome duplication sets.

  14. Assessment of water resources for nuclear energy centers

    Energy Technology Data Exchange (ETDEWEB)

    Samuels, G.

    1976-09-01

    Maps of the conterminous United States showing the rivers with sufficient flow to be of interest as potential sites for nuclear energy centers are presented. These maps show the rivers with (1) mean annual flows greater than 3000 cfs, with the flow rates identified for ranges of 3000 to 6000, 6000 to 12,000, 12,000 to 24,000, and greater than 24,000 cfs; (2) monthly, 20-year low flows greater than 1500 cfs, with the flow rates identified for ranges of 1500 to 3000, 3000 to 6000, 6000 to 12,000, and greater than 12,000 cfs; and (3) annual, 20-year low flows greater than 1500 cfs, with the flow rates identified for ranges of 1500 to 3000, 3000 to 6000, 6000 to 12,000, and greater than 12,000 cfs. Criteria relating river flow rates required for various size generating stations both for sites located on reservoirs and for sites without local storage of cooling water are discussed. These criteria are used in conjunction with plant water consumption rates (based on both instantaneous peak and annual average usage rates) to estimate the installed generating capacity that may be located at one site or within a river basin. Projections of future power capacity requirements, future demand for water (both withdrawals and consumption), and regions of expected water shortages are also presented. Regional maps of water availability, based on annual, 20-year low flows, are also shown. The feasibility of locating large energy centers in these regions is discussed.

  16. Amarillo National Resource Center for Plutonium quarterly technical progress report, August 1, 1997--October 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This report summarizes activities of the Amarillo National Resource Center for Plutonium during the quarter. The report describes the Electronic Resource Library; DOE support activities; current and future environmental health and safety programs; pollution prevention and pollution avoidance; communication, education, training, and community involvement programs; and nuclear and other material studies, including plutonium storage and disposition studies.

  17. 34 CFR 656.1 - What is the National Resource Centers Program?

    Science.gov (United States)

    2010-07-01

    ... STUDIES OR FOREIGN LANGUAGE AND INTERNATIONAL STUDIES General § 656.1 What is the National Resource... Foreign Language and International Studies (National Resource Centers Program), the Secretary awards... international studies and the international and foreign language aspects of professional and other fields of...

  18. Strategizing for the Future: Evolving Cultural Resource Centers in Higher Education

    Science.gov (United States)

    Shek, Yen Ling

    2013-01-01

    Cultural resource centers have been an ongoing and integral component of creating a more welcoming campus climate for Students of Color since their establishment in the 1960s. While the racial dynamics may have changed, many of the challenges Students of Color face on predominantly White campuses have not. Interestingly, cultural resource centers…

  19. 78 FR 14303 - Statement of Delegation of Authority; Health Resources and Services Administration and Centers...

    Science.gov (United States)

    2013-03-05

    ... Services Administration and Centers for Disease Control and Prevention I hereby delegate to the Administrator, Health Resources and Services Administration (HRSA), and the Director, Centers for Disease Control and Prevention (CDC), with authority to redelegate, the authority vested in the Secretary of the...

  20. Nuclear Energy Center Site Survey, 1975. Part V. Resource availability and site screening

    International Nuclear Information System (INIS)

    1976-01-01

    Resource requirements for nuclear energy centers are discussed and the large land areas which meet these requirements and may contain potential sites for a nuclear energy center (NEC) are identified. Maps of the areas are included that identify seismic zones, river flow rates, and population density

  1. Library/Media Centers in U.S. Public Schools: Growth, Staffing, and Resources. Full Report

    Science.gov (United States)

    Tuck, Kathy D.; Holmes, Dwight R.

    2016-01-01

    At the request of New Business Item: 89 (NBI: 89) adopted at the 2015 NEA Representative Assembly, this study examines the extent to which students have access to public school library/media centers with qualified staff and up-to-date resources. The study explores trends in library/media center openings and closings as well as staffing patterns…

  2. Using Language Corpora to Develop a Virtual Resource Center for Business English

    Science.gov (United States)

    Ngo, Thi Phuong Le

    2015-01-01

    A Virtual Resource Center (VRC) has been brought into use since 2008 as an integral part of a task-based language teaching and learning program for Business English courses at Nantes University, France. The objective of the center is to enable students to work autonomously and individually on their language problems so as to improve their language…

  3. National Training Center Fort Irwin expansion area aquatic resources survey

    Energy Technology Data Exchange (ETDEWEB)

    Cushing, C.E.; Mueller, R.P.

    1996-02-01

    Biologists from Pacific Northwest National Laboratory (PNNL) were requested by personnel from Fort Irwin to conduct a biological reconnaissance of the Avawatz Mountains northeast of Fort Irwin, an area for proposed expansion of the Fort. Surveys of vegetation, small mammals, birds, reptiles, amphibians, and aquatic resources were conducted during 1995 to characterize the populations and habitats present, with emphasis on determining the presence of any species of special concern. This report presents a description of the sites sampled, a list of the organisms found and identified, and a discussion of relative abundance. Taxonomic identifications were done to the lowest level possible commensurate with determining the status of the taxa relative to their possible listing as threatened, endangered, or candidate species. Consultation with taxonomic experts was undertaken for the Coleoptera and Hemiptera. In addition to listing the macroinvertebrates found, the authors also present a discussion related to the possible presence of any threatened or endangered species or species of concern found in Sheep Creek Springs, Tin Cabin Springs, and the Amargosa River.

  4. The MMS Science Data Center: Operations, Capabilities, and Resource.

    Science.gov (United States)

    Larsen, K. W.; Pankratz, C. K.; Giles, B. L.; Kokkonen, K.; Putnam, B.; Schafer, C.; Baker, D. N.

    2015-12-01

    The Magnetospheric MultiScale (MMS) constellation of satellites completed their six-month commissioning period in August 2015 and began science operations. Science operations for the Solving Magnetospheric Acceleration, Reconnection, and Turbulence (SMART) instrument package occur at the Laboratory for Atmospheric and Space Physics (LASP). The Science Data Center (SDC) at LASP is responsible for the data production, management, distribution, and archiving of the data received. The mission will collect several gigabytes per day of particle and field data. Management of these data requires effective selection, transmission, analysis, and storage of data in the ground segment of the mission, including efficient distribution paths to enable the science community to answer the key questions regarding magnetic reconnection. Due to the constraints on download volume, this includes the Scientist-in-the-Loop program that identifies high-value science data needed to answer the outstanding questions of magnetic reconnection. Of particular interest to the community are the tools and associated website we have developed to provide convenient access to the data, first by the mission science team and, beginning March 1, 2016, by the entire community. This presentation will demonstrate the data and tools available to the community via the SDC and discuss the technologies we chose and lessons learned.

  5. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  6. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book makes it possible to compare the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook to assess the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  7. Big Data in Cloud Computing: A Resource Management Perspective

    Directory of Open Access Journals (Sweden)

    Saeed Ullah

    2018-01-01

    Modern-day advancement is increasingly digitizing our lives, which has led to a rapid growth of data. Such multidimensional datasets are precious due to the potential of unearthing new knowledge and developing decision-making insights from them. Analyzing this huge amount of data from multiple sources can help organizations to plan for the future and anticipate changing market trends and customer requirements. While the Hadoop framework is a popular platform for processing larger datasets, there are a number of other computing infrastructures available to use in various application domains. The primary focus of the study is how to classify major big data resource management systems in the context of a cloud computing environment. We identify some key features which characterize big data frameworks, as well as their associated challenges and issues. We use various evaluation metrics from different aspects to identify usage scenarios of these platforms. The study came up with some interesting findings which are in contradiction with the available literature on the Internet.

  8. Mobilizing Learning Resources in a Transnational Classroom: Translocal and Digital Resources in a Community Technology Center

    Science.gov (United States)

    Noguerón-Liu, Silvia

    2014-01-01

    Drawing from transnational and activity theory frameworks, this study analyzes the ways translocal flows shape learning in a community technology center serving adult immigrants in the US Southwest. It also explores students' constructions of the transnational nature of the courses they took, where they had access to both online and face-to-face…

  9. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During the period 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 hardware from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The coupling of these two CPUs (hardware and software) enables the system to switch automatically to either the first or the second CPU. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodical listing, and specific calculations concerning the process (balances, etc.), and, at a later stage, automatic control of certain units of the process. [fr

  10. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  11. Geriatric resources in acute care hospitals and trauma centers: a scarce commodity.

    Science.gov (United States)

    Maxwell, Cathy A; Mion, Lorraine C; Minnick, Ann

    2013-12-01

    The number of older adults admitted to acute care hospitals with traumatic injury is rising. The purpose of this study was to examine the location of five prominent geriatric resource programs in U.S. acute care hospitals and trauma centers (N = 4,865). As of 2010, 5.8% of all U.S. hospitals had at least one of these programs. Only 8.8% of trauma centers were served by at least one program; the majority were in Level I trauma centers. Slow adoption of geriatric resource programs in hospitals may be due to lack of champions who will advocate for these programs, lack of evidence of their impact on outcomes, or lack of a business plan to support adoption. Future studies should focus on the benefits of geriatric resource programs from patients' perspectives, as well as from business case and outcomes perspectives. Copyright 2013, SLACK Incorporated.

  12. M-center growth in alkali halides: computer simulation

    International Nuclear Information System (INIS)

    Aguilar, M.; Jaque, F.; Agullo-Lopez, F.

    1983-01-01

    The heterogeneous interstitial nucleation model previously proposed to explain F-center growth curves in irradiated alkali halides has been extended to account for M-center kinetics. The interstitials produced during the primary irradiation event are assumed to be trapped at impurities and interstitial clusters or recombine with F and M centers. For M-center formation two cases have been considered: (a) diffusion and aggregation of F centers, and (b) statistical generation and pairing of F centers. Process (b) is the only one consistent with the quadratic relationship between M and F center concentrations. However, to account for the F/M ratios experimentally observed as well as for the role of dose-rate, a modified statistical model involving random creation and association of F⁺-F pairs has been shown to be adequate. (author)
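
    The quadratic relationship invoked above follows from a one-line kinetic argument, sketched here from general kinetics rather than taken from the paper: if newly created F centers pair statistically with the F centers already present, the M-center production rate is proportional both to the F-center concentration and to its growth rate,

        \frac{dn_M}{dt} \;\propto\; n_F \,\frac{dn_F}{dt}
        \quad\Longrightarrow\quad
        n_M \;\propto\; n_F^{\,2},

    which integrates to the quadratic dependence between M- and F-center concentrations that singles out process (b).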

  13. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models. Previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented and will include a discussion of the generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for transferring/translating TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs
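
    The client/server paradigm described here can be illustrated with a minimal modern sketch, using Python's standard xmlrpc library as a stand-in for the ONC RPC and XDR machinery of the original; the service and function names are invented.

        # server side: publish a compute service as a remote procedure
        from xmlrpc.server import SimpleXMLRPCServer

        def simulate(steps):
            # stand-in for a heavy computation (e.g., one fluids time step)
            return sum(i * 0.001 for i in range(steps))

        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(simulate)   # argument marshalling is handled for us
        server.serve_forever()

        # client side (separate process): a local-looking call, executed remotely,
        # which lets the server act as a shared co-processor for the workstation
        # import xmlrpc.client
        # proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
        # print(proxy.simulate(10_000))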

  14. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Continued demand in the modern IT sector for resource-hungry services and applications has led to the development of cloud computing. A cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner, so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  15. Turnover intentions in a call center: The role of emotional dissonance, job resources, and job satisfaction.

    Directory of Open Access Journals (Sweden)

    Margherita Zito

    Full Text Available Turnover intentions refer to employees' intent to leave the organization and, within call centers, it can be influenced by factors such as relational variables or the perception of the quality of working life, which can be affected by emotional dissonance. This specific job demand to express emotions not felt is peculiar in call centers, and can influence job satisfaction and turnover intentions, a crucial problem among these working contexts. This study aims to detect, within the theoretical framework of the Job Demands-Resources Model, the role of emotional dissonance (job demand, and two resources, job autonomy and supervisors' support, in the perception of job satisfaction and turnover intentions among an Italian call center.The study involved 318 call center agents of an Italian Telecommunication Company. Data analysis first performed descriptive statistics through SPSS 22. A path analysis was then performed through LISREL 8.72 and tested both direct and indirect effects.Results suggest the role of resources in fostering job satisfaction and in decreasing turnover intentions. Emotional dissonance reveals a negative relation with job satisfaction and a positive relation with turnover. Moreover, job satisfaction is negatively related with turnover and mediates the relationship between job resources and turnover.This study contributes to extend the knowledge about the variables influencing turnover intentions, a crucial problem among call centers. Moreover, the study identifies theoretical considerations and practical implications to promote well-being among call center employees. To foster job satisfaction and reduce turnover intentions, in fact, it is important to make resources available, but also to offer specific training programs to make employees and supervisors aware about the consequences of emotional dissonance.

  16. Turnover intentions in a call center: The role of emotional dissonance, job resources, and job satisfaction

    Science.gov (United States)

    Zito, Margherita; Molino, Monica; Cortese, Claudio Giovanni; Ghislieri, Chiara; Colombo, Lara

    2018-01-01

    Background: Turnover intentions refer to employees' intent to leave the organization; within call centers, they can be influenced by factors such as relational variables or the perception of the quality of working life, which can be affected by emotional dissonance. This specific job demand, expressing emotions not felt, is peculiar to call centers and can influence job satisfaction and turnover intentions, a crucial problem in these working contexts. This study aims to detect, within the theoretical framework of the Job Demands-Resources Model, the role of emotional dissonance (job demand), and of two resources, job autonomy and supervisors' support, in the perception of job satisfaction and turnover intentions in an Italian call center. Method: The study involved 318 call center agents of an Italian telecommunication company. Data analysis first involved descriptive statistics in SPSS 22; a path analysis was then performed in LISREL 8.72, testing both direct and indirect effects. Results: The results suggest the role of resources in fostering job satisfaction and in decreasing turnover intentions. Emotional dissonance shows a negative relation with job satisfaction and a positive relation with turnover. Moreover, job satisfaction is negatively related to turnover and mediates the relationship between job resources and turnover. Conclusion: This study contributes to extending knowledge about the variables influencing turnover intentions, a crucial problem in call centers. Moreover, the study identifies theoretical considerations and practical implications to promote well-being among call center employees. To foster job satisfaction and reduce turnover intentions, it is important to make resources available, but also to offer specific training programs that make employees and supervisors aware of the consequences of emotional dissonance. PMID:29401507

  17. Turnover intentions in a call center: The role of emotional dissonance, job resources, and job satisfaction.

    Science.gov (United States)

    Zito, Margherita; Emanuel, Federica; Molino, Monica; Cortese, Claudio Giovanni; Ghislieri, Chiara; Colombo, Lara

    2018-01-01

    Turnover intentions refer to employees' intent to leave the organization; within call centers, they can be influenced by factors such as relational variables or the perception of the quality of working life, which can be affected by emotional dissonance. This specific job demand, expressing emotions not felt, is peculiar to call centers and can influence job satisfaction and turnover intentions, a crucial problem in these working contexts. This study aims to detect, within the theoretical framework of the Job Demands-Resources Model, the role of emotional dissonance (job demand), and of two resources, job autonomy and supervisors' support, in the perception of job satisfaction and turnover intentions in an Italian call center. The study involved 318 call center agents of an Italian telecommunication company. Data analysis first involved descriptive statistics in SPSS 22; a path analysis was then performed in LISREL 8.72, testing both direct and indirect effects. Results suggest the role of resources in fostering job satisfaction and in decreasing turnover intentions. Emotional dissonance shows a negative relation with job satisfaction and a positive relation with turnover. Moreover, job satisfaction is negatively related to turnover and mediates the relationship between job resources and turnover. This study contributes to extending knowledge about the variables influencing turnover intentions, a crucial problem in call centers. Moreover, the study identifies theoretical considerations and practical implications to promote well-being among call center employees. To foster job satisfaction and reduce turnover intentions, it is important to make resources available, but also to offer specific training programs that make employees and supervisors aware of the consequences of emotional dissonance.

  18. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has come into wide use; in particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and graphics technology is being applied in more and more fields. In recent years, with the development of the social economy, and especially the rapid development of information technology, the traditional way of managing communication resources can no longer effectively meet the needs of resource management: it still relies on the original management tools and methods for equipment management and maintenance, which has brought many problems. It is very difficult for non-professionals to understand the equipment and its status in communication resource management, resource utilization is relatively low, and managers cannot quickly and accurately grasp resource conditions. Aimed at the above problems, this paper proposes to introduce computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  19. An Audit on the Appropriateness of Coronary Computed Tomography Angiography Referrals in a Tertiary Cardiac Center.

    Science.gov (United States)

    Alderazi, Ahmed Ali; Lynch, Mary

    2017-01-01

    In response to growing concerns regarding the overuse of coronary computed tomography angiography (CCTA) in the clinical setting, multiple societies, including the American College of Cardiology Foundation, have jointly published revised criteria regarding the appropriate use of this imaging modality. However, previous research indicates significant discrepancies in the rate of adherence to these guidelines. The aim of this study was to assess the appropriateness of CCTA referrals in a tertiary cardiac center in Bahrain. This retrospective clinical audit examined the records of patients referred for CCTA between April 1, 2015 and December 31, 2015 at the Mohammed bin Khalifa Cardiac Center. Using information from medical records, each case was meticulously audited against the guidelines and categorized as appropriate, inappropriate, or uncertain. Of the 234 records examined, 176 (75.2%) were appropriate, 47 (20.1%) were uncertain, and 11 (4.7%) were inappropriate. Overall, 74.4% of all referrals were to investigate coronary artery disease (CAD). The most common indication deemed appropriate was the detection of CAD in the setting of suspected ischemic equivalent in patients with an intermediate pretest probability of CAD (65.9%). Most referrals deemed inappropriate were requested to detect CAD in asymptomatic patients at low or intermediate risk of CAD (63.6%). This audit demonstrates a relatively low rate of inappropriate CCTA referrals, indicating appropriate and efficient use of this resource at the Mohammed bin Khalifa Cardiac Center. Agreement on and reclassification of "uncertain" cases by guideline authorities would facilitate a deeper understanding of referral appropriateness.

  20. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of the different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for the cloud computing environment has been designed. This system performs unified management of the virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practical runs, virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of the computing resources increased significantly compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
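
    A minimal sketch of the dual-threshold policy described above; the threshold values and the decision interface are assumed for illustration, not taken from the IHEPCloud implementation. The pool grows when queued demand exceeds the upper threshold and shrinks when idle capacity exceeds the lower one, always staying within the quota.

        # Illustrative dual-threshold elastic pool decision rule.
        def scale_decision(idle_jobs, idle_vms, pool_size,
                           upper=50, lower=10, quota=200):
            """Return +n VMs to launch, -n to terminate, or 0 to hold."""
            if idle_jobs > upper and pool_size < quota:
                # expand enough to cover the backlog, capped by the quota
                return min(idle_jobs - upper, quota - pool_size)
            if idle_jobs == 0 and idle_vms > lower:
                # shrink: retire surplus idle VMs beyond the lower threshold
                return -(idle_vms - lower)
            return 0

        print(scale_decision(idle_jobs=120, idle_vms=0, pool_size=80))  # +70
        print(scale_decision(idle_jobs=0, idle_vms=25, pool_size=80))   # -15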

  1. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of this work is to highlight the key features of, and afford the finest future directions for, the research community of Resource Allocation, Resource Scheduling and Resource Management from 2009 to 2016, exemplifying how research on these topics has progressively increased in the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized as a three-fold process. Firstly, investigate on t...

  2. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs...... in determining the relative value of cloud computing....

  3. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers in implementing Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children with the intention of engaging children and increasing interest, rather than formally teaching concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  4. 15 CFR 291.4 - National industry-specific pollution prevention and environmental compliance resource centers.

    Science.gov (United States)

    2010-01-01

    Title 15, Commerce and Foreign Trade, Part 1 (2010-01-01): National industry-specific pollution prevention and environmental compliance resource centers. Section 291.4, Regulations Relating to Commerce and Foreign Trade, National Institute of Standards and Technology, Department of Commerce, NIST Extramural Program...

  5. Automated Library Networking in American Public Community College Learning Resources Centers.

    Science.gov (United States)

    Miah, Adbul J.

    1994-01-01

    Discusses the need for community colleges to assess their participation in automated library networking systems (ALNs). Presents results of questionnaires sent to 253 community college learning resource center directors to determine their use of ALNs. Reviews benefits of automation and ALN activities, planning and communications, institution size,…

  6. Measuring Malaysia School Resource Centers' Standards through iQ-PSS: An Online Management Information System

    Science.gov (United States)

    Zainudin, Fadzliaton; Ismail, Kamarulzaman

    2010-01-01

    The Ministry of Education has come up with an innovative way to monitor the progress of 9,843 School Resource Centers (SRCs) using an online management information system called iQ-PSS (Quality Index of SRC). This paper aims to describe the data collection method and analyze the current state of SRCs in Malaysia and explain how the results can be…

  7. Expanding the Intellectual Property Knowledge Base at University Libraries: Collaborating with Patent and Trademark Resource Centers

    Science.gov (United States)

    Wallace, Martin; Reinman, Suzanne

    2018-01-01

    Patent and Trademark Resource Centers are located in libraries throughout the U.S., with 43 being in academic libraries. With the importance of incorporating a knowledge of intellectual property (IP) and patent research in university curricula nationwide, this study developed and evaluated a partnership program to increase the understanding of IP…

  8. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and the enterprise management mode, this paper considers a new business computing mode, cloud computing, in which the resource scheduling strategy is the key technology. Based on a study of the cloud computing system structure and mode of operation, the key research addresses the job scheduling and resource allocation problems in cloud computing on the basis of the ant colony algorithm, with detailed analysis and design of the...
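
    A minimal sketch of ant colony optimization applied to cloud task scheduling, under assumed inputs (a task-by-machine matrix of estimated execution times) and with makespan as the objective; this is a generic ACO loop for illustration, not the specific algorithm designed in the paper.

        # Generic ACO for assigning T tasks to M machines.
        import random

        def aco_schedule(cost, n_ants=20, n_iter=100, rho=0.1, alpha=1.0, beta=2.0):
            T, M = len(cost), len(cost[0])
            tau = [[1.0] * M for _ in range(T)]       # pheromone trails
            best, best_cost = None, float("inf")
            for _ in range(n_iter):
                for _ in range(n_ants):
                    assign = []
                    for t in range(T):
                        # desirability = pheromone^alpha * (1/cost)^beta
                        w = [tau[t][m] ** alpha * (1.0 / cost[t][m]) ** beta
                             for m in range(M)]
                        r, acc, choice = random.random() * sum(w), 0.0, M - 1
                        for m in range(M):
                            acc += w[m]
                            if r <= acc:
                                choice = m
                                break
                        assign.append(choice)
                    load = [0.0] * M                  # per-machine load
                    for t, m in enumerate(assign):
                        load[m] += cost[t][m]
                    if max(load) < best_cost:         # makespan objective
                        best, best_cost = assign, max(load)
                # evaporate, then reinforce the best-so-far assignment
                tau = [[(1 - rho) * x for x in row] for row in tau]
                for t, m in enumerate(best):
                    tau[t][m] += 1.0 / best_cost
            return best, best_cost

        tasks_on_vms = [[4.0, 9.0], [7.0, 3.0], [5.0, 5.0]]  # toy cost matrix
        print(aco_schedule(tasks_on_vms))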

  9. Zebrafish Health Conditions in the China Zebrafish Resource Center and 20 Major Chinese Zebrafish Laboratories.

    Science.gov (United States)

    Liu, Liyue; Pan, Luyuan; Li, Kuoyu; Zhang, Yun; Zhu, Zuoyan; Sun, Yonghua

    2016-07-01

    In China, the use of zebrafish as an experimental animal in the past 15 years has widely expanded. The China Zebrafish Resource Center (CZRC), which was established in 2012, is becoming one of the major resource centers in the global zebrafish community. Large-scale use and regular exchange of zebrafish resources have put forward higher requirements on zebrafish health issues in China. This article reports the current aquatic infrastructure design, animal husbandry, and health-monitoring programs in the CZRC. Meanwhile, through a survey of 20 Chinese zebrafish laboratories, we also describe the current health status of major zebrafish facilities in China. We conclude that it is of great importance to establish a widely accepted health standard and health-monitoring strategy in the Chinese zebrafish research community.

  10. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  11. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  12. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of state-of-the-art resource discovery techniques rely on static resource attributes during resource selection. However, the resources matched on static attributes may not be the most appropriate for the execution of user applications, because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable ones. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for the discovery of grid resources using the P2P formalism. The proposed approach considers multiple resource attributes in the resource selection decision and provides the most suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase applies an integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from the different super-peers. A pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
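
    A minimal sketch of the first-phase SAW (Simple Additive Weighting) ranking described above, with illustrative attribute names and weights; all attributes are treated as benefit criteria (higher is better) and are normalized by the maximum across candidates.

        # SAW ranking of candidate grid resources (illustrative).
        def saw_rank(resources, weights):
            """resources: list of dicts attribute -> value;
            weights: dict attribute -> weight, summing to 1."""
            maxima = {a: max(r[a] for r in resources) for a in weights}
            scored = [(sum(w * r[a] / maxima[a] for a, w in weights.items()), r)
                      for r in resources]
            return sorted(scored, key=lambda p: p[0], reverse=True)

        candidates = [
            {"cpu_ghz": 2.4, "ram_gb": 8, "free_disk_gb": 100},
            {"cpu_ghz": 3.0, "ram_gb": 4, "free_disk_gb": 250},
        ]
        print(saw_rank(candidates,
                       {"cpu_ghz": 0.5, "ram_gb": 0.3, "free_disk_gb": 0.2}))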

  13. Evaluation of health resource utilization efficiency in community health centers of Jiangsu Province, China.

    Science.gov (United States)

    Xu, Xinglong; Zhou, Lulin; Antwi, Henry Asante; Chen, Xi

    2018-02-20

    While the demand for health services keeps escalating at the grass roots and in rural areas of China, a substantial portion of healthcare resources remains stagnant in the more developed cities, and this has entrenched health inequity in many parts of China. At its conception, China's Deepen Medical Reform, started in 2012, was intended to flush out possible disparities and promote a more equitable and efficient distribution of healthcare resources. Nearly half a decade into this reform, there are uncertainties as to whether attainment of its objectives is in sight. Using a hybrid of panel data analysis and an augmented data envelopment analysis (DEA), we model human, material, and financial resources to determine their technical and scale efficiency and to comprehensively evaluate the transverse and longitudinal allocation efficiency of community health resources in Jiangsu Province. We observed that the Deepen Medical Reform in China has led to increased concern among health policy makers in the province to ensure efficient allocation of community health resources. This has led to greater efficiency of health resource allocation in Jiangsu in general, but serious regional and municipal disparities still exist. Using the DEA model, we note that the output from the Community Health Centers is not commensurate with the substantial resources (human, material, and financial) invested in them, and that the case is worst in the less-developed northern parts of Jiangsu Province. The government of Jiangsu Province could improve the efficiency of health resource allocation by improving the community health service system, rationalizing the allocation of health personnel, optimizing the allocation of material resources, and enhancing the allocation of financial resources.
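
    A minimal sketch of the kind of efficiency computation DEA performs, using the standard input-oriented CCR envelopment model on hypothetical data; this illustrates the general technique, not the augmented DEA model of the study.

        # Input-oriented CCR DEA efficiency scores via linear programming.
        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y):
            """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
            n = X.shape[0]
            scores = []
            for o in range(n):
                # decision variables: [theta, lambda_1 .. lambda_n]
                c = np.r_[1.0, np.zeros(n)]
                # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
                A_in = np.c_[-X[o][:, None], X.T]
                # outputs: -sum_j lam_j * y_rj <= -y_ro
                A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
                res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                              b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                              bounds=[(0, None)] * (n + 1), method="highs")
                scores.append(res.x[0])  # theta = 1.0 means efficient
            return scores

        X = np.array([[20, 300.0], [30, 500.0], [25, 400.0]])  # staff, budget
        Y = np.array([[1000.0], [1100.0], [1300.0]])           # visits served
        print(ccr_efficiency(X, Y))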

  14. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  15. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  16. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure

  17. Establishing a health outcomes and economics center in radiology: strategies and resources required

    International Nuclear Information System (INIS)

    Medina, Santiago L.; Altman, Nolan R.

    2002-01-01

    To describe the resources and strategies required to establish a health outcomes and economics center in radiology. Methods: Human and nonhuman resources required to perform sound outcomes and economics studies in radiology are reviewed. Results: Human resources needed include skilled medical and nonmedical staff. Nonhuman resources required are: (1) a communication and information network; (2) education tools and training programs; (3) budgetary strategies; and (4) sources of income. Effective utilization of these resources allows the performance of robust operational and clinical research projects in decision analysis, cost-effectiveness, diagnostic performance (sensitivity, specificity, and ROC curves), and clinical analytical and experimental studies. Conclusion: As new radiologic technology and techniques are introduced in medicine, society is increasingly demanding sound clinical studies that will determine the impact of radiologic studies on patient outcome. Health-care funding is scarce, and therefore third-party payers and hospitals are demanding more efficiency and productivity from radiologic service providers. To meet these challenges, radiology departments could establish health outcomes and economics centers to study the clinical effectiveness of imaging and its impact on patient outcome. (orig.)

  18. Resource-Aware Load Balancing Scheme using Multi-objective Optimization in Cloud Computing

    OpenAIRE

    Kavita Rana; Vikas Zandu

    2016-01-01

    Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the internet. In cloud computing there are usually a number of jobs that need to be executed with the available resources so as to achieve optimal performance, the least possible total time for completion, the shortest response time, efficient utilization of resources, etc. Hence, job scheduling is the most important concern, which aims to ensure that the user's requirements are ...

  19. Western Mineral and Environmental Resources Science Center--providing comprehensive earth science for complex societal issues

    Science.gov (United States)

    Frank, David G.; Wallace, Alan R.; Schneider, Jill L.

    2010-01-01

    Minerals in the environment and products manufactured from mineral materials are all around us, and we use and come into contact with them every day. They impact our way of life and the health of all that lives. Minerals are critical to the Nation's economy, and knowing where future mineral resources will come from is important for sustaining the Nation's economy and national security. The U.S. Geological Survey (USGS) Mineral Resources Program (MRP) provides scientific information for objective resource assessments and unbiased research results on mineral resource potential, production and consumption statistics, as well as the environmental consequences of mining. The MRP conducts this research to provide information needed by land planners and decisionmakers about where mineral commodities are known and suspected in the earth's crust and about the environmental consequences of extracting those commodities. As part of the MRP, scientists of the Western Mineral and Environmental Resources Science Center (WMERSC or 'Center' herein) coordinate the development of national geologic, geochemical, geophysical, and mineral-resource databases and the migration of existing databases to standard models and formats that are available to both internal and external users. The unique expertise developed by Center scientists over many decades in response to mineral-resource-related issues is now in great demand to support applications such as public health research and remediation of environmental hazards that result from mining and mining-related activities. Results of WMERSC research provide timely and unbiased analyses of minerals and inorganic materials to (1) improve stewardship of public lands and resources; (2) support national and international economic and security policies; (3) sustain prosperity and improve our quality of life; and (4) protect and improve public health, safety, and environmental quality. The MRP

  20. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by the Google Company in the United States as an Internet-centered approach providing a standard and open network sharing service. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide sharing methods, has therefore arrived like timely rain and has become an important means of sharing digital education resources in current higher education. Against the background of the cloud computing environment, this paper analyzes the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. In view of cloud computing's sharing characteristics of mass storage, efficient operation and low cost, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the designed sharing model is put into practical application.

  1. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  2. LHCb Computing Resources: 2011 re-assessment, 2012 request and 2013 forecast

    CERN Document Server

    Graciani, R

    2011-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2011 data-taking period, the request of computing resource needs for the 2012 data-taking period, and a first forecast of the 2013 needs, when no data taking is foreseen. Estimates are based on 2010 experience and the latest updates to the LHC schedule, as well as on a new implementation of the computing model simulation tool. Differences in the model and deviations in the estimates from previously presented results are stressed.

  3. LHCb Computing Resources: 2012 re-assessment, 2013 request and 2014 forecast

    CERN Document Server

    Graciani Diaz, Ricardo

    2012-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2012 data-taking period, the request of computing resource needs for 2013, and a first forecast of the 2014 needs, when the restart of data taking is foreseen. Estimates are based on 2011 experience, as well as on the results of a simulation of the computing model described in the document. Differences in the model and deviations in the estimates from previously presented results are stressed.

  4. Establishing Network Interaction between Resource Training Centers for People with Disabilities and Partner Universities

    Directory of Open Access Journals (Sweden)

    Panyukova S.V.,

    2018-05-01

    Full Text Available The paper focuses on the problem of the accessibility and quality of higher education for students with disabilities. We describe our experience in organising network interaction between the MSUPE Resource and Training Center for Disabled People, established in 2016-2017, and partner universities in 'fixed territories'. The need for cooperation and network interaction arises from the high demand for pooling the efforts of leading experts, researchers, methodologists and instructors, which is necessary for improving the quality and accessibility of higher education for persons with disabilities. The Resource and Training Center offers counseling for the partner universities, arranges advanced training for those responsible for teaching the disabled, and offers specialized equipment for temporary use. In this article, we emphasize the importance of organizing network interaction with universities and social partners in order to ensure the accessibility of higher education for students with disabilities.

  5. FORMATION OF TEACHERS-TUTOR ICT COMPETENCE OF DISTANCE EDUCATION RESOURCE CENTER

    Directory of Open Access Journals (Sweden)

    Olga E. Konevchshynska

    2014-09-01

    Full Text Available The paper analyzes the main approaches to defining the ICT competence of professionals who provide training and methodological support for distance learning. It highlights the level of scientific development of the problem, identifies and substantiates the essence of teachers' ICT competence, and reviews international and domestic experience of teacher training in the sphere of information technologies. It is indicated that one of the main tasks of resource centers for distance education is to ensure an appropriate level of qualification of the teacher-tutors working in the network of resource centers. The levels of ICT competence necessary for the successful professional activity of network teachers are also pointed out.

  6. Science and Technology Resources on the Internet: Computer Security.

    Science.gov (United States)

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  7. Activities and experience of the Federal Resource Center for Organizing Comprehensive Support for Children with ASD

    Directory of Open Access Journals (Sweden)

    Khaustov A.V.

    2016-12-01

    Full Text Available This article presents the basic activities and experience of the Federal Resource Center for Organizing Comprehensive Support for Children with ASD of Moscow State University of Psychology & Education, amassed during 22 years of practice. Some statistical data on the center's activity are presented. Emphasis is placed on multidirectional work and on developing ways of interdepartmental and networking interaction for the sake of founding a system of comprehensive support for autistic children in the Russian Federation.

  8. Evaluation of a fungal collection as certified reference material producer and as a biological resource center

    Directory of Open Access Journals (Sweden)

    Tatiana Forti

    2016-06-01

    Full Text Available Considering the absence of standards for culture collections, and more specifically for biological resource centers, in the world, in addition to the absence of certified biological material in Brazil, this study aimed to evaluate a fungal collection from Fiocruz as a producer of certified reference material and as a Biological Resource Center (BRC). For this evaluation, a checklist based on the requirements of ABNT ISO GUIA 34:2012, correlated with ABNT NBR ISO/IEC 17025:2005, was designed and applied. Complementing the implementation of the checklist, an internal audit was performed. An evaluation of this collection as a BRC was also conducted following the requirements of NIT-DICLA-061, the Brazilian internal standard from Inmetro, based on ABNT NBR ISO/IEC 17025:2005, ABNT ISO GUIA 34:2012 and the OECD Best Practice Guidelines for BRCs. This was the first time that NIT-DICLA-061 was applied in a culture collection during an internal audit. The assessments enabled a proposal for the adequacy of this collection to assure the implementation of the management system for its future accreditation by Inmetro as a certified reference material producer, as well as its future accreditation as a Biological Resource Center according to NIT-DICLA-061.

  9. Evaluation of a fungal collection as certified reference material producer and as a biological resource center.

    Science.gov (United States)

    Forti, Tatiana; Souto, Aline da S S; do Nascimento, Carlos Roberto S; Nishikawa, Marilia M; Hubner, Marise T W; Sabagh, Fernanda P; Temporal, Rosane Maria; Rodrigues, Janaína M; da Silva, Manuela

    2016-01-01

    Considering the absence of standards for culture collections, and more specifically for biological resource centers, in the world, in addition to the absence of certified biological material in Brazil, this study aimed to evaluate a fungal collection from Fiocruz as a producer of certified reference material and as a Biological Resource Center (BRC). For this evaluation, a checklist based on the requirements of ABNT ISO GUIA 34:2012, correlated with ABNT NBR ISO/IEC 17025:2005, was designed and applied. Complementing the implementation of the checklist, an internal audit was performed. An evaluation of this collection as a BRC was also conducted following the requirements of NIT-DICLA-061, the Brazilian internal standard from Inmetro, based on ABNT NBR ISO/IEC 17025:2005, ABNT ISO GUIA 34:2012 and the OECD Best Practice Guidelines for BRCs. This was the first time that NIT-DICLA-061 was applied in a culture collection during an internal audit. The assessments enabled a proposal for the adequacy of this collection to assure the implementation of the management system for its future accreditation by Inmetro as a certified reference material producer, as well as its future accreditation as a Biological Resource Center according to NIT-DICLA-061. Copyright © 2016 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.

  10. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services, as well as activities of NERSC staff.

  11. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    Science.gov (United States)

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.

  12. Renewable Resources: a national catalog of model projects. Volume 1. Northeast Solar Energy Center Region

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    This compilation of diverse conservation and renewable energy projects across the United States was prepared through the enthusiastic participation of solar and alternate energy groups from every state and region. Compiled and edited by the Center for Renewable Resources, these projects reflect many levels of innovation and technical expertise. In many cases, a critical analysis is presented of how projects performed and of the institutional conditions associated with their success or failure. Some 2000 projects are included in this compilation; most have worked, some have not. Information about all is presented to aid learning from these experiences. The four volumes in this set are arranged in state sections by geographic region, coinciding with the four Regional Solar Energy Centers. The table of contents is organized by project category so that maximum cross-referencing may be obtained. This volume includes information on the Northeast Solar Energy Center Region. (WHK).

  13. Renewable Resources: a national catalog of model projects. Volume 3. Southern Solar Energy Center Region

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    This compilation of diverse conservation and renewable energy projects across the United States was prepared through the enthusiastic participation of solar and alternate energy groups from every state and region. Compiled and edited by the Center for Renewable Resources, these projects reflect many levels of innovation and technical expertise. In many cases, a critical analysis is presented of how projects performed and of the institutional conditions associated with their success or failure. Some 2000 projects are included in this compilation; most have worked, some have not. Information about all is presented to aid learning from these experiences. The four volumes in this set are arranged in state sections by geographic region, coinciding with the four Regional Solar Energy Centers. The table of contents is organized by project category so that maximum cross-referencing may be obtained. This volume includes information on the Southern Solar Energy Center Region. (WHK)

  14. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    Science.gov (United States)

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  15. Conception of a computer for the nuclear medical department of the Augsburg hospital center

    International Nuclear Information System (INIS)

    Graf, G.; Heidenreich, P.

    1984-01-01

    A computer system based on the Siemens R30 process computer has been employed at the Institute of Nuclear Medicine of the Augsburg Hospital Center since early 1981. This system, including the development and testing of organ-specific evaluation programs, was used as a basis for the conception of the new computer system for the department of nuclear medicine of the Augsburg Hospital Center. The computer system was extended and installed according to this conception when the new 1400-bed hospital was opened in the 3rd phase of construction in autumn 1982. (orig.) [de]

  16. ``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis

    Science.gov (United States)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.
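
    As a concrete illustration of the amortised-cost reasoning the paper automates (this is the classic potential-method textbook example, not the authors' type-based analysis): a dynamic array that doubles on overflow has occasional O(n) appends, yet constant amortised cost, because each append deposits a fixed credit that pays for future copying.

        # Potential-method bookkeeping for a doubling dynamic array.
        # Each append is charged 3 units: 1 for the write itself and
        # 2 banked against the copying a future resize will perform.
        class AmortisedArray:
            def __init__(self):
                self.data, self.size, self.cap = [None], 0, 1
                self.credits = 0

            def append(self, x):
                self.credits += 3               # constant amortised charge
                if self.size == self.cap:       # resize: pay from the bank
                    self.credits -= self.size   # one unit per copied cell
                    assert self.credits >= 0, "amortised bound violated"
                    self.data.extend([None] * self.cap)
                    self.cap *= 2
                self.data[self.size] = x
                self.size += 1
                self.credits -= 1               # pay for the write

        arr = AmortisedArray()
        for i in range(100):
            arr.append(i)                       # never trips the assertion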

  17. Quantum computing with incoherent resources and quantum jumps.

    Science.gov (United States)

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  18. Center for Computer Security newsletter. Volume 2, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-05-01

    The Fifth Computer Security Group Conference was held November 16 to 18, 1982, at the Knoxville Hilton in Knoxville, Tennessee. Attending were 183 people, representing the Department of Energy, DOE contractors, other government agencies, and vendor organizations. These papers are abridgements of most of the papers presented in Knoxville; fewer than half a dozen speakers failed to furnish either abstracts or full-text papers of their presentations.

  19. 75 FR 78997 - Centers for Disease Control and Prevention/Health Resources and Services Administration (CDC/HRSA...

    Science.gov (United States)

    2010-12-17

    ... Department of Health and Human Services, Centers for Disease Control and Prevention: Centers for Disease Control and Prevention/Health Resources and Services Administration (CDC/HRSA) Advisory Committee... and other committee management activities, for both the Centers for Disease Control and Prevention and...

  20. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  1. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  2. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  3. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Nan Zhang

    Full Text Available Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, each with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we allocate the revenues based on the providers' contributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  4. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, each with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we allocate the revenues based on the providers' contributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
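
    A minimal sketch of the Shapley-value revenue split referenced in both records above, assuming a hypothetical characteristic function v that maps each provider coalition to the revenue it can earn. Exact enumeration over orderings is exponential, so this is only practical for small coalitions.

        # Exact Shapley value for a small provider coalition.
        from itertools import permutations

        def shapley(players, v):
            """players: list of ids; v: frozenset -> revenue."""
            phi = {p: 0.0 for p in players}
            orders = list(permutations(players))
            for order in orders:
                coalition = frozenset()
                for p in order:
                    # marginal revenue p adds by joining in this order
                    phi[p] += v(coalition | {p}) - v(coalition)
                    coalition = coalition | {p}
            return {p: phi[p] / len(orders) for p in players}

        # Hypothetical revenues: devices A and B earn most together.
        def v(s):
            return {frozenset(): 0, frozenset("A"): 1,
                    frozenset("B"): 2, frozenset("AB"): 6}[s]

        print(shapley(["A", "B"], v))  # {'A': 2.5, 'B': 3.5}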

  5. A Guide to the Data Resources of the Henry A. Murray Research Center of Radcliffe College: A Center for the Study of Lives [and] Index to [the] Guide.

    Science.gov (United States)

    Radcliffe Coll., Cambridge, MA. Henry A. Murray Research Center.

    The first of two volumes provides information about data resources available at the Henry A. Murray Research Center of Radcliffe College, a multidisciplinary research center that is a national repository for social and behavioral science data on human development and social change; topics of special concern to women are collection priorities. The…

  6. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  7. Surgical resource utilization in urban terrorist bombing: a computer simulation.

    Science.gov (United States)

    Hirshberg, A; Stein, M; Walden, R

    1999-09-01

    The objective of this study was to analyze the utilization of surgical staff and facilities during an urban terrorist bombing incident. A discrete-event computer model of the emergency room and related hospital facilities was constructed and implemented, based on cumulated data from 12 urban terrorist bombing incidents in Israel. The simulation predicts that the admitting capacity of the hospital depends primarily on the number of available surgeons and defines an optimal staff profile for surgeons, residents, and trauma nurses. The major bottlenecks in the flow of critical casualties are the shock rooms and the computed tomographic scanner but not the operating rooms. The simulation also defines the number of reinforcement staff needed to treat noncritical casualties and shows that radiology is the major obstacle to the flow of these patients. Computer simulation is an important new tool for the optimization of surgical service elements for a multiple-casualty situation.
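
    A minimal sketch of the discrete-event modelling approach the study describes, with hypothetical parameters (random arrivals over one hour, a fixed surgeon pool, a fixed treatment time): casualties compete for surgeons, and the event list advances simulated time so waiting can be measured.

        # Toy discrete-event model: casualties competing for surgeons.
        import heapq, random

        random.seed(1)
        N_SURGEONS, TREAT_MIN = 4, 30.0            # assumed parameters
        arrivals = sorted(random.uniform(0, 60) for _ in range(25))

        free_at = [0.0] * N_SURGEONS               # when each surgeon frees up
        heapq.heapify(free_at)
        waits = []
        for t in arrivals:
            s = heapq.heappop(free_at)             # earliest-available surgeon
            start = max(t, s)                      # wait if nobody is free
            waits.append(start - t)
            heapq.heappush(free_at, start + TREAT_MIN)

        print(f"mean wait {sum(waits)/len(waits):.1f} min, "
              f"max wait {max(waits):.1f} min")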

  8. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  9. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    Computer-based information systems (CBIS) have been adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  10. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR) to outsource computation tasks to each computing resource provider (CRP). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction between the CRR and CRPs, in this paper we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the behaviors of the CRR and CRPs in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm based on the exchange of control messages, which contain the information of computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate computing resource management as a game with a Nash equilibrium achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources toward the benefit of the overall system.
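
    A minimal sketch of price-mediated coordination of the kind described above, under assumed functional forms (logarithmic requestor utility a*ln(x), quadratic provider cost c*y^2), so demand and supply have closed forms; the price rises while demand exceeds supply, a standard tatonnement step rather than the paper's exact message protocol.

        # Iterative real-time pricing: adjust price until the market clears.
        def demand(p, a=100.0):
            return a / p                      # argmax of a*ln(x) - p*x

        def supply(p, c=0.05):
            return p / (2 * c)                # argmax of p*y - c*y**2

        p, step = 1.0, 0.01
        for _ in range(2000):
            excess = demand(p) - supply(p)    # net demand at current price
            p = max(1e-6, p + step * excess)  # raise price if over-demanded
        print(f"clearing price {p:.3f}: demand {demand(p):.1f}, "
              f"supply {supply(p):.1f}")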

  11. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in…

  12. TOWARDS NEW COMPUTATIONAL ARCHITECTURES FOR MASS-COLLABORATIVE OPENEDUCATIONAL RESOURCES

    OpenAIRE

    Ismar Frango Silveira; Xavier Ochoa; Antonio Silva Sprock; Pollyana Notargiacomo Mustaro; Yosly C. Hernandez Bieluskas

    2011-01-01

    Open Educational Resources offer several benefits, mostly in education and training. Being potentially reusable, their use can reduce the time and cost of developing educational programs, so that these savings can be transferred directly to students through the production of a large range of open, freely available content, varying from hypermedia to digital textbooks. This paper discusses this issue and presents a project and a research network that, in spite of being directed to Latin America'...

  13. Engaging Community Stakeholders to Evaluate the Design, Usability, and Acceptability of a Chronic Obstructive Pulmonary Disease Social Media Resource Center

    Science.gov (United States)

    Chaney, Beth; Chaney, Don; Paige, Samantha; Payne-Purvis, Caroline; Tennant, Bethany; Walsh-Childers, Kim; Sriram, PS; Alber, Julia

    2015-01-01

    Background Patients with chronic obstructive pulmonary disease (COPD) often report inadequate access to comprehensive patient education resources. Objective The purpose of this study was to incorporate community-engagement principles within a mixed-method research design to evaluate the usability and acceptability of a self-tailored social media resource center for medically underserved patients with COPD. Methods A multiphase sequential design (qual → QUANT → quant + QUAL) was incorporated into the current study, whereby a small-scale qualitative (qual) study informed the design of a social media website prototype that was tested with patients during a computer-based usability study (QUANT). To identify usability violations and determine whether or not patients found the website prototype acceptable for use, each patient was asked to complete an 18-item website usability and acceptability questionnaire, as well as a retrospective, in-depth, semistructured interview (quant + QUAL). Results The majority of medically underserved patients with COPD (n=8, mean 56 years, SD 7) found the social media website prototype to be easy to navigate and relevant to their self-management information needs. Mean responses on the 18-item website usability and acceptability questionnaire were very high on a scale of 1 (strongly disagree) to 5 (strongly agree) (mean 4.72, SD 0.33). However, the majority of patients identified several usability violations related to the prototype’s information design, interactive capabilities, and navigational structure. Specifically, 6 out of 8 (75%) patients struggled to create a log-in account to access the prototype, and 7 out of 8 patients (88%) experienced difficulty posting and replying to comments on an interactive discussion forum. Conclusions Patient perceptions of most social media website prototype features (eg, clickable picture-based screenshots of videos, comment tools) were largely positive. Mixed-method stakeholder feedback was

  14. Human resources management in fitness centers and their relationship with the organizational performance

    Directory of Open Access Journals (Sweden)

    Jerónimo García Fernández

    2014-12-01

    Purpose: Human capital is essential in organizations providing sports services. However, few studies examine which human resource practices are carried out and whether they help sports organizations achieve better results. The aim of this paper is therefore to analyze human resource management practices in private fitness centers and their relationship with organizational performance. Design/methodology/approach: A questionnaire was administered to 101 managers of private fitness centers in Spain, followed by exploratory and confirmatory factor analyses and linear regressions between the variables. Findings: In fitness organizations, the findings show that training, reward, communication, and selection practices are positively correlated with organizational performance. Research limitations/implications: A convenience sample drawn from a single country limits the extrapolation of the results to the wider market. Originality/value: First, the paper contributes to a literature with no prior studies analyzing human resource management in sport organizations from the point of view of top managers. Second, it allows fitness center managers to adopt practices that improve organizational performance.

  15. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  16. Computer System Resource Requirements of Novice Programming Students.

    Science.gov (United States)

    Nutt, Gary J.

    The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…

  17. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  18. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.
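
    A minimal sketch of the three-layer split described (user / management / resource); the class and image names are hypothetical, and the real system streams QEMU images over the network rather than passing strings.

      # Hypothetical three-layer SRSM sketch: a user-layer terminal asks the
      # management layer for an OS image held in the resource layer.
      class ResourceLayer:
          def __init__(self):
              self.images = {"android": "img-android.qcow2", "linux": "img-linux.qcow2"}
          def fetch(self, name):
              return self.images[name]

      class ManagementLayer:
          def __init__(self, resources):
              self.resources = resources
              self.sessions = {}
          def allocate(self, user, os_name):
              image = self.resources.fetch(os_name)
              self.sessions[user] = image  # track which image serves which terminal
              return image

      class MobileVirtualTerminal:
          def __init__(self, user, manager):
              self.user, self.manager = user, manager
          def boot(self, os_name):
              image = self.manager.allocate(self.user, os_name)
              return f"{self.user}: booting {os_name} from remote {image}"

      manager = ManagementLayer(ResourceLayer())
      print(MobileVirtualTerminal("alice", manager).boot("android"))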

  19. A novel resource management method of providing operating system as a service for mobile transparent computing.

    Science.gov (United States)

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  20. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  1. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: assumptions for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  2. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development, and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include commodity markets, posted prices, tenders, and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
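
    As a concrete instance of the posted-price model mentioned, a broker might select, per job, the cheapest provider that can still meet the deadline. The providers, rates, and job below are invented for illustration; Nimrod/G's actual scheduler is considerably more elaborate.

      # Toy posted-price broker: cheapest provider that meets the deadline.
      providers = [
          {"name": "siteA", "price_per_cpu_hour": 0.12, "cpus": 64},
          {"name": "siteB", "price_per_cpu_hour": 0.08, "cpus": 16},
          {"name": "siteC", "price_per_cpu_hour": 0.20, "cpus": 256},
      ]

      def cheapest_feasible(cpu_hours, deadline_hours):
          """Posted-price selection: cheapest provider that can finish in time."""
          feasible = [p for p in providers
                      if cpu_hours / p["cpus"] <= deadline_hours]
          if not feasible:
              return None
          return min(feasible, key=lambda p: p["price_per_cpu_hour"] * cpu_hours)

      job = {"cpu_hours": 320, "deadline_hours": 4}
      choice = cheapest_feasible(job["cpu_hours"], job["deadline_hours"])
      print(choice["name"] if choice else "no provider meets the deadline")

    Relaxing the deadline makes cheaper, smaller providers feasible, which is the basic trade-off behind deadline- and cost-based scheduling.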

  3. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  4. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  5. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform. (orig.)

  6. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  7. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  8. Inbound Call Centers and Emotional Dissonance in the Job Demands - Resources Model.

    Science.gov (United States)

    Molino, Monica; Emanuel, Federica; Zito, Margherita; Ghislieri, Chiara; Colombo, Lara; Cortese, Claudio G

    2016-01-01

    Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic in call centers. In particular, interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job especially for call center inbound activities. The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support, and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service (CA) and information service. The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the CA and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Analyses showed that CA agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the CA group, a full mediation of emotional dissonance between workload and affective discomfort, and a partial mediation of customer verbal aggression and job autonomy, and affective discomfort. This study's findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities. Suggestions for organizations and practitioners emerged in order to identify practical implications useful both to support…

  9. Inbound Call Centers and Emotional Dissonance in the Job Demands – Resources Model

    Science.gov (United States)

    Molino, Monica; Emanuel, Federica; Zito, Margherita; Ghislieri, Chiara; Colombo, Lara; Cortese, Claudio G.

    2016-01-01

    Background: Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic in call centers. In particular, interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job especially for call center inbound activities. Aim: The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support, and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service (CA) and information service. Method: The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the CA and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Results: Analyses showed that CA agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the CA group, a full mediation of emotional dissonance between workload and affective discomfort, and a partial mediation of customer verbal aggression and job autonomy, and affective discomfort. Conclusion: This study’s findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities. Suggestions for organizations and practitioners emerged in order to identify

  10. Criticality Safety Information Resource Center Web portal: www.csirc.net

    International Nuclear Information System (INIS)

    Harmon, C.D. II; Jones, T.

    2000-01-01

    The Nuclear Criticality Safety Group (ESH-6) at Los Alamos National Laboratory (LANL) is in the process of collecting and archiving historical and technical information related to nuclear criticality safety from LANL and other facilities. In an ongoing effort, this information is being made available via the Criticality Safety Information Resource Center (CSIRC) web site, which is hosted and maintained by ESH-6 staff. Recently, the CSIRC Web site was recreated as a Web portal that provides the criticality safety community with much more than just archived data

  11. Inbound Call Centers and Emotional Dissonance in the Job Demands – Resources Model

    Directory of Open Access Journals (Sweden)

    Monica Molino

    2016-07-01

    Background: Emotional labor, defined as the process of regulating feelings and expressions as part of the work role, is a major characteristic in call centers. In particular, interacting with customers, agents are required to show certain emotions that are considered acceptable by the organization, even though these emotions may be different from their true feelings. This kind of experience is defined as emotional dissonance and represents a feature of the job especially for call center inbound activities. Aim: The present study was aimed at investigating whether emotional dissonance mediates the relationship between job demands (workload and customer verbal aggression) and job resources (supervisor support, colleague support, and job autonomy) on the one hand, and, on the other, affective discomfort, using the job demands-resources model as a framework. The study also observed differences between two different types of inbound activities: customer assistance service and information service. Method: The study involved agents of an Italian Telecommunication Company, 352 of whom worked in the customer assistance service and 179 in the information service. The hypothesized model was tested across the two groups through multi-group structural equation modeling. Results: Analyses showed that customer assistance service agents experience greater customer verbal aggression and emotional dissonance than information service agents. Results also showed, only for the customer assistance service group, a full mediation of emotional dissonance between workload and affective discomfort, and a partial mediation of customer verbal aggression and job autonomy, and affective discomfort. Conclusion: This study's findings contributed both to the emotional labor literature, investigating the mediational role of emotional dissonance in the job demands-resources model, and to call center literature, considering differences between two specific kinds of inbound activities…

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  13. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized, while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid, it is now possible to utilize these resources by forming a lightweight campus grid. The campus grid framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper highlights 18 months of operational experience creating campus grids in the US and the different campus configurations that have successfully utilized the campus grid infrastructure.
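
    The overflow principle is simple enough to sketch. The cluster names, capacities, and first-fit policy below are invented placeholders; production campus grids implement this with Condor-style matchmaking rather than a Python loop.

      # Minimal overflow sketch: run at home if possible, spill to siblings.
      class Cluster:
          def __init__(self, name, slots):
              self.name, self.free = name, slots
          def try_run(self, job):
              if self.free > 0:
                  self.free -= 1
                  return True
              return False

      def submit(job, home, siblings):
          """Run at the home cluster, else overflow to the first sibling with room."""
          for cluster in [home, *siblings]:
              if cluster.try_run(job):
                  return cluster.name
          return "queued"  # all clusters full; wait at home

      hep = Cluster("hep", 1)
      others = [Cluster("cse", 0), Cluster("bio", 2)]
      print([submit(f"job{i}", hep, others) for i in range(4)])
      # -> ['hep', 'bio', 'bio', 'queued']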

  14. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  15. Using High Performance Computing to Support Water Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Groves, David G. [RAND Corporation, Santa Monica, CA (United States); Lembert, Robert J. [RAND Corporation, Santa Monica, CA (United States); May, Deborah W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leek, James R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Syme, James [RAND Corporation, Santa Monica, CA (United States)

    2015-10-22

    In recent years, decision support modeling has embraced deliberation-with-analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models on standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting valuable interaction between decisionmakers and analytics.
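
    The computational pattern behind this is many independent scenario runs, which parallelize trivially on HPC resources. The toy "model" and its parameters below are invented, standing in for a slow water-planning simulation.

      # Embarrassingly parallel scenario sweep over invented futures.
      from concurrent.futures import ProcessPoolExecutor
      from itertools import product

      def run_model(params):
          demand_growth, supply_shock = params
          # placeholder for a long-running simulation of one future
          shortage = max(0.0, demand_growth * 100 - (1 - supply_shock) * 90)
          return params, shortage

      if __name__ == "__main__":
          scenarios = list(product([0.8, 1.0, 1.2], [0.0, 0.2, 0.4]))  # 9 futures
          with ProcessPoolExecutor() as pool:
              for params, shortage in pool.map(run_model, scenarios):
                  print(params, f"shortage={shortage:.1f}")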

  16. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is at odds with the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge here, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server, which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have the outbound connectivity to interact with the DIRAC system in general.
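
    A schematic sketch of the gateway pattern described, with all names hypothetical: volunteer clients never see a credential, and only the trusted gateway (standing in for the real service that talks to DIRAC) holds the X.509 proxy.

      import queue

      class TrustedGateway:
          """Runs on a trusted server; the only place the X.509 proxy lives."""
          def __init__(self, x509_proxy_path):
              self.proxy = x509_proxy_path   # credential never leaves this host
              self.jobs = queue.Queue()
              self.results = []
          def fetch_jobs_from_grid(self):
              # placeholder for authenticated calls into the DIRAC-like system
              for i in range(3):
                  self.jobs.put({"id": i, "payload": f"event-batch-{i}"})
          def hand_out(self):                # volunteer side: no credential needed
              return None if self.jobs.empty() else self.jobs.get()
          def accept_result(self, job_id, output):
              self.results.append((job_id, output))  # uploaded with the gateway's proxy

      gw = TrustedGateway("/etc/grid-security/proxy.pem")
      gw.fetch_jobs_from_grid()
      while (job := gw.hand_out()) is not None:
          gw.accept_result(job["id"], f"processed {job['payload']}")
      print(gw.results)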

  17. Decision making in water resource planning: Models and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Fedra, K; Carlsen, A J [ed.

    1987-01-01

    This paper describes some basic concepts of simulation-based decision support systems for water resources management and the role of symbolic, graphics-based user interfaces. Designed to allow direct and easy access to advanced methods of analysis and decision support for a broad and heterogeneous group of users, these systems combine database management, system simulation, operations research techniques such as optimization, interactive data analysis, elements of advanced decision technology, and artificial intelligence with a friendly and conversational, symbolic display-oriented user interface. Important features of the interface are the use of several parallel or alternative styles of interaction and display, including colour graphics and natural language. Combining quantitative numerical methods with qualitative and heuristic approaches, and giving the user direct and interactive control over the system's functions, human knowledge, experience, and judgement are integrated with formal approaches into a tightly coupled man-machine system through an intelligent and easily accessible user interface. 4 drawings, 42 references.

  18. Monitoring of computing resource utilization of the ATLAS experiment

    International Nuclear Information System (INIS)

    Rousseau, David; Vukotic, Ilija; Schaffer, RD; Dimitrov, Gancho; Aidel, Osman; Albrand, Solveig

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.
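
    To illustrate the kind of per-job, per-algorithm records such a system gathers, here is a toy collector; SQLite stands in for the production Oracle database, and the metric names and values are invented.

      import sqlite3, time

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE job_metrics (
          job_id TEXT, step TEXT, algorithm TEXT,
          cpu_seconds REAL, max_rss_mb REAL, output_mb REAL, ts REAL)""")

      def report(job_id, step, algorithm, cpu_seconds, max_rss_mb, output_mb):
          conn.execute("INSERT INTO job_metrics VALUES (?,?,?,?,?,?,?)",
                       (job_id, step, algorithm, cpu_seconds, max_rss_mb,
                        output_mb, time.time()))

      report("job-001", "reco", "TrackFinder", 512.3, 1800.0, 42.0)
      report("job-001", "reco", "CaloCluster", 98.7, 1650.0, 11.5)
      top = conn.execute("""SELECT algorithm, SUM(cpu_seconds) FROM job_metrics
                            GROUP BY algorithm ORDER BY 2 DESC""").fetchall()
      print(top)  # which algorithms dominate CPU: the question optimizers ask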

  19. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  20. KEY ISSUES OF CONCEPTS' FORMATION OF THE NETWORK OF RESOURCE CENTER OF DISTANCE EDUCATION OF GENERAL EDUCATION INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Yuriy M. Bogachkov

    2013-06-01

    The article presents the problem of constructing a network of resource centers for distance education to meet the needs of general secondary schools. Modern educational trends in the use of Internet services in education are reviewed. The main contradictions whose solution would help create a network of resource centers are identified, and key terms related to this range of issues are defined. The basic categories of participants toward whom the implementation of e-learning and networking is oriented are described. The paper considers the basic tasks of distance education resource centers and the types of support they require: personnel, regulatory, informational, methodological, technical, etc. Possible models for implementing students' distance education are reviewed, and three options for business models of resource centers, depending on funding sources, are offered.

  1. Improvements to PATRIC, the all-bacterial Bioinformatics Database and Analysis Resource Center

    Science.gov (United States)

    Wattam, Alice R.; Davis, James J.; Assaf, Rida; Boisvert, Sébastien; Brettin, Thomas; Bun, Christopher; Conrad, Neal; Dietrich, Emily M.; Disz, Terry; Gabbard, Joseph L.; Gerdes, Svetlana; Henry, Christopher S.; Kenyon, Ronald W.; Machi, Dustin; Mao, Chunhong; Nordberg, Eric K.; Olsen, Gary J.; Murphy-Olson, Daniel E.; Olson, Robert; Overbeek, Ross; Parrello, Bruce; Pusch, Gordon D.; Shukla, Maulik; Vonstein, Veronika; Warren, Andrew; Xia, Fangfang; Yoo, Hyunseung; Stevens, Rick L.

    2017-01-01

    The Pathosystems Resource Integration Center (PATRIC) is the bacterial Bioinformatics Resource Center (https://www.patricbrc.org). Recent changes to PATRIC include a redesign of the web interface and some new services that provide users with a platform that takes them from raw reads to an integrated analysis experience. The redesigned interface allows researchers direct access to tools and data, and the emphasis has changed to user-created genome-groups, with detailed summaries and views of the data that researchers have selected. Perhaps the biggest change has been the enhanced capability for researchers to analyze their private data and compare it to the available public data. Researchers can assemble their raw sequence reads and annotate the contigs using RASTtk. PATRIC also provides services for RNA-Seq, variation, model reconstruction and differential expression analysis, all delivered through an updated private workspace. Private data can be compared by ‘virtual integration’ to any of PATRIC's public data. The number of genomes available for comparison in PATRIC has expanded to over 80 000, with a special emphasis on genomes with antimicrobial resistance data. PATRIC uses this data to improve both subsystem annotation and k-mer classification, and tags new genomes as having signatures that indicate susceptibility or resistance to specific antibiotics. PMID:27899627

  2. The effective use of virtualization for selection of data centers in a cloud computing environment

    Science.gov (United States)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers consist of networks of remote servers that store, access, and process data. Cloud computing is a technology in which users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers employ virtualization so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
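
    The selection rule as described reduces to summing per-VM energies per server and per data center, then routing to the minimum. A direct sketch, with all energy figures invented:

      # servers -> per-VM energy (W); values are placeholders
      data_centers = {
          "dc-east": [[120.0, 95.5], [80.0]],
          "dc-west": [[60.0, 70.0, 75.0]],
          "dc-north": [[200.0], [150.0, 90.0]],
      }

      def total_energy(servers):
          # data-center energy = sum over servers of the sum of their VM energies
          return sum(sum(vm_energies) for vm_energies in servers)

      def select_data_center(centers):
          return min(centers, key=lambda name: total_energy(centers[name]))

      for name, servers in data_centers.items():
          print(name, total_energy(servers))
      print("route task to:", select_data_center(data_centers))  # -> dc-west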

  3. Performance evaluation of data center service localization based on virtual resource migration in software defined elastic optical network.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tan, Yuanlong; Lin, Yi; Han, Jianrui; Lee, Young

    2015-09-07

    Data center interconnection with elastic optical network is a promising scenario to meet the high-burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which allows data center services to be accommodated. In view of this, this study extends the data center resources to the user side to enhance the end-to-end quality of service. We propose a novel data center service localization (DCSL) architecture based on virtual resource migration in a software defined elastic data center optical network. A migration evaluation scheme (MES) is introduced for DCSL based on the proposed architecture. The DCSL can enhance the responsiveness to dynamic end-to-end data center demands and effectively reduce the blocking probability to globally optimize optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of our OpenFlow-based enhanced SDN testbed. The performance of the MES scheme under a heavy traffic load scenario is also quantitatively evaluated based on the DCSL architecture in terms of path blocking probability, provisioning latency, and resource utilization, compared with other provisioning schemes.

  4. Plutonium research and related activities at the Amarillo National Resource Center for Plutonium

    International Nuclear Information System (INIS)

    Hartley, R.S.; Beard, C.A.; Barnes, D.L.

    1998-01-01

    With the end of the Cold War, the US and Russia are reducing their nuclear weapons stockpiles. What to do with the materials from thousands of excess nuclear weapons is an important international challenge. How to handle the remaining US stockpile to ensure safe storage and reliability, in light of the aging support infrastructure, is an important national challenge. To help address these challenges and related issues, the Amarillo National Resource Center for Plutonium is working on behalf of the State of Texas with the US Department of Energy (DOE). The center directs three major programs that address the key aspects of the plutonium management issue: (1) the Communications, Education, Training and Community Involvement Program, which focuses on informing the public about plutonium and providing technical education at all levels; (2) the Environmental, Safety, and Health (ES and H) Program, which investigates the key ES and H impacts of activities related to the DOE weapons complex in Texas; and (3) the Nuclear and Other Materials Program, which is aimed at minimizing safety and proliferation risks by helping to develop and advocate safe stewardship, storage, and disposition of nuclear weapons materials. This paper provides an overview of the center's nuclear activities in four broad categories: international activities, materials safety, plutonium storage, and plutonium disposition.

  5. Sensor and computing resource management for a small satellite

    Science.gov (United States)

    Bhatia, Abhilasha; Goehner, Kyle; Sand, John; Straub, Jeremy; Mohammad, Atif; Korvald, Christoffer; Nervold, Anders Kose

    A small satellite in a low-Earth orbit (e.g., approximately a 300 to 400 km altitude) has an orbital velocity in the range of 8.5 km/s and completes an orbit approximately every 90 minutes. For a satellite with minimal attitude control, this presents a significant challenge in obtaining multiple images of a target region. Presuming an inclination in the range of 50 to 65 degrees, a limited number of opportunities to image a given target or communicate with a given ground station are available over the course of a 24-hour period. For imaging needs (where solar illumination is required), the number of opportunities is further reduced. Given these short windows of opportunity for imaging, data transfer, and sending commands, scheduling must be optimized. In addition to the high-level scheduling performed for spacecraft operations, payload-level scheduling is also required. The mission requires that images be post-processed to maximize spatial resolution and minimize data transfer (through removing overlapping regions). The payload unit includes GPS and inertial measurement unit (IMU) hardware to aid in image alignment for these tasks. The payload scheduler must, thus, split its energy and computing-cycle budgets between determining an imaging sequence (required to capture the highly overlapping data required for super-resolution and the adjacent areas required for mosaicking), processing the imagery (to perform the super-resolution and mosaicking), and preparing the data for transmission (compressing it, etc.). This paper presents an approach for satellite control, scheduling, and operations that allows the cameras, GPS, and IMU to be used in conjunction to acquire higher-resolution imagery of a target region.
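
    A toy version of the budget split the payload scheduler faces: greedy selection by value per joule under one orbit's energy budget. Task names, costs, values, and caps are invented; the flight scheduler would also track computing cycles and timing windows.

      TASKS = [  # (name, energy_J_per_unit, value_per_unit, max_units_this_orbit)
          ("capture_image", 40.0, 5.0, 3),
          ("superresolve_pair", 90.0, 8.0, 2),
          ("compress_for_downlink", 25.0, 3.0, 4),
      ]

      def plan(budget_j):
          """Greedy: best value per joule first, respecting per-task caps."""
          ranked = sorted(TASKS, key=lambda t: t[2] / t[1], reverse=True)
          schedule, remaining = [], budget_j
          for name, cost, _value, cap in ranked:
              while cap > 0 and cost <= remaining:
                  schedule.append(name)
                  remaining -= cost
                  cap -= 1
          return schedule, remaining

      schedule, leftover = plan(300.0)
      print(schedule, f"leftover={leftover:.0f} J")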

  6. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    ... area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA ... Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center ... atomic and molecular level, he said. He noted that "every general would like to have" a Star Trek-like holodeck, where holographic avatars could ...

  7. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    The mobile cloud computing (MCC) paradigm, which combines mobile computing and the cloud concept, takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to user mobility, the provided resources have different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multitask applications is a challenging problem. This paper realizes programmable resource provision for heterogeneous energy-constrained computing environments, where a software defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem over the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximately optimal solutions. Simulation results show that the solutions for all of our studied cases almost reach the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
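
    A minimal single-objective particle swarm over a weighted sum of the three metrics named (energy, cost, availability); the objective shapes and weights are invented, and the paper's full formulation is multiobjective over real resource parameters.

      import random

      random.seed(0)
      W = (0.5, 0.3, 0.2)  # weights: energy, cost, unavailability

      def objective(x):  # x in [0,1]: fraction of tasks offloaded to the cloud
          energy = (1 - x) ** 2   # local execution burns device energy
          cost = x * 0.8          # cloud time is billed
          unavail = 0.3 * x ** 2  # more offload -> more exposure to outages
          return W[0] * energy + W[1] * cost + W[2] * unavail

      n, iters = 20, 60
      pos = [random.random() for _ in range(n)]
      vel = [0.0] * n
      best = pos[:]                    # per-particle best
      gbest = min(pos, key=objective)  # swarm best
      for _ in range(iters):
          for i in range(n):
              r1, r2 = random.random(), random.random()
              vel[i] = (0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i])
                        + 1.5 * r2 * (gbest - pos[i]))
              pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))
              if objective(pos[i]) < objective(best[i]):
                  best[i] = pos[i]
          gbest = min(best, key=objective)
      print(f"offload fraction ~ {gbest:.3f}, objective ~ {objective(gbest):.4f}")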

  8. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article the problem of scientific projects support throughout their lifecycle in the computer center is considered in every aspect of support. Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of strong integration of IT infrastructure components with the use of virtualization, control of infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research projects support, the influence of the Configuration Management system is reviewed and development of the corresponding elements of the system is described in the present paper.

  9. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    This article considers the support of scientific projects throughout their lifecycle in a computer center, in every aspect of that support. The Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research projects support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  10. Performance evaluation of multi-stratum resources integrated resilience for software defined inter-data center interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Wu, Jialin; Lin, Yi; Han, Jianrui; Lee, Young

    2015-05-18

    Inter-data center interconnect with IP over elastic optical network (EON) is a promising scenario to meet the high-burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resource integration among IP networks, optical networks, and application stratum resources, which allows data center services to be accommodated. In view of this, this study extends the work to consider service resilience in case of edge optical node failure. We propose a novel multi-stratum resources integrated resilience (MSRIR) architecture for services in software defined inter-data center interconnect based on IP over EON. A global resources integrated resilience (GRIR) algorithm is introduced based on the proposed architecture. The MSRIR can enable cross-stratum optimization, provide resilience using the resources of multiple stratums, and enhance the responsiveness of data center service resilience to dynamic end-to-end service demands. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of our OpenFlow-based enhanced SDN (eSDN) testbed. The performance of the GRIR algorithm under a heavy traffic load scenario is also quantitatively evaluated based on the MSRIR architecture in terms of path blocking probability, resilience latency, and resource utilization, compared with other resilience algorithms.

  11. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

    The Technical Data Management Center, which collects, packages, analyzes, and distributes information, computer technology, and data, including meteorological and other environmental transport work, is located at the Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described.

  12. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, NIC/JSC already installed the first phase of the GCS HPC Tier-0 resources in May 2009: an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe alread...

  13. The Criticality Safety Information Resource Center (CSIRC) at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Henderson, B.D.; Meade, R.A.; Pruvost, N.L.

    1999-01-01

    The Criticality Safety Information Resource Center (CSIRC) at Los Alamos National Laboratory (LANL) is a program jointly funded by the U.S. Department of Energy (DOE) and the U.S. Nuclear Regulatory Commission (NRC) in conjunction with the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 97-2. The goal of CSIRC is to preserve primary criticality safety documentation from U.S. critical experimental sites and to make this information available for the benefit of the technical community. Progress in archiving criticality safety primary documents at the LANL archives as well as efforts to make this information available to researchers are discussed. The CSIRC project has a natural linkage to the International Criticality Safety Benchmark Evaluation Project (ICSBEP). This paper raises the possibility that the CSIRC project will evolve in a fashion similar to the ICSBEP. Exploring the implications of linking the CSIRC to the international criticality safety community is the motivation for this paper

  14. Mars Atmospheric In Situ Resource Utilization Projects at the Kennedy Space Center

    Science.gov (United States)

    Muscatello, A. C.; Hintze, P. E.; Caraccio, A. J.; Bayliss, J. A.; Karr, L. J.; Paley, M. S.; Marone, M. J.; Gibson, T. L.; Surma, J. M.; Mansell, J. M.; hide

    2016-01-01

    The atmosphere of Mars, which is approximately 95% carbon dioxide (CO2), is a rich resource for the human exploration of the red planet, primarily through the production of rocket propellants and oxygen for life support. Three recent projects led by NASA's Kennedy Space Center have been investigating the processing of CO2. The first project successfully demonstrated the Mars Atmospheric Processing Module (APM), which freezes CO2 with cryocoolers and combines sublimated CO2 with hydrogen to make methane and water. The second project absorbs CO2 with ionic liquids and electrolyzes it with water to make methane and oxygen, but with limited success so far. A third project plans to recover up to 100% of the oxygen in spacecraft respiratory CO2. A combination of the reverse water gas shift reaction and the Boudouard reaction eventually fills the reactor with carbon, stopping the process. A system to continuously remove and collect carbon is under construction.
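
    For reference, the standard stoichiometry behind the processes named (textbook chemistry, not taken from the project reports):

      \begin{align*}
      \text{Sabatier (APM methanation):}\quad & \mathrm{CO_2 + 4\,H_2 \rightarrow CH_4 + 2\,H_2O}\\
      \text{Reverse water gas shift:}\quad & \mathrm{CO_2 + H_2 \rightarrow CO + H_2O}\\
      \text{Boudouard (carbon deposition):}\quad & \mathrm{2\,CO \rightarrow C + CO_2}
      \end{align*}

    Pairing the reverse water gas shift with water electrolysis and the Boudouard reaction is what allows, in principle, both oxygen atoms of CO2 to be recovered, at the price of solid carbon accumulating in the reactor.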

  15. X-ray microscopy resource center at the Advanced Light Source

    International Nuclear Information System (INIS)

    Meyer-Ilse, W.; Koike, M.; Beguiristain, R.; Maser, J.; Attwood, D.

    1992-07-01

    An x-ray microscopy resource center for biological x-ray imaging will be built at the Advanced Light Source (ALS) in Berkeley. The unique high brightness of the ALS allows short exposure times and high image quality. Two microscopes, an x-ray microscope (XM) and a scanning x-ray microscope (SXM), are planned. These microscopes serve complementary needs. The XM gives images in parallel at comparably short exposure times, and the SXM is optimized for low radiation doses applied to the sample. The microscopes extend visible light microscopy towards significantly higher resolution and permit images of objects in an aqueous medium. High resolution is accomplished by the use of Fresnel zone plates. Design considerations to serve the needs of biological x-ray microscopy are given, and the preliminary design of the microscopes is presented. Multiple wavelength and multiple view images will provide elemental contrast and some degree of 3D information.

  16. Mars Atmospheric In Situ Resource Utilization Projects at the Kennedy Space Center

    Science.gov (United States)

    Muscatello, Anthony; Hintze, Paul; Meier, Anne; Bayliss, Jon; Karr, Laurel; Paley, Steve; Marone, Matt; Gibson, Tracy; Surma, Jan; Mansell, Matt; hide

    2016-01-01

    The atmosphere of Mars, which is 96 percent carbon dioxide (CO2), is a rich resource for the human exploration of the red planet, primarily through the production of rocket propellants and oxygen for life support. Three recent projects led by NASA's Kennedy Space Center have been investigating the processing of CO2. The first project successfully demonstrated the Mars Atmospheric Processing Module (APM), which freezes CO2 with cryocoolers and combines sublimated CO2 with hydrogen to make methane and water. The second project absorbs CO2 with ionic liquids and electrolyzes it with water to make methane and oxygen, but with limited success so far. A third project plans to recover up to 100 percent of the oxygen in spacecraft respiratory CO2. A combination of the Reverse Water Gas Shift reaction and the Boudouard reaction eventually fills the reactor with carbon, stopping the process. A system to continuously remove and collect carbon has been tested with encouraging results.

  17. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Clinical support role for a pharmacy technician within a primary care resource center.

    Science.gov (United States)

    Fera, Toni; Kanel, Keith T; Bolinger, Meghan L; Fink, Amber E; Iheasirim, Serah

    2018-02-01

    The creation of a clinical support role for a pharmacy technician within a primary care resource center is described. In the Primary Care Resource Center (PCRC) Project, hospital-based care transition coordination hubs staffed by nurses and pharmacist teams were created in 6 independent community hospitals. At the largest site, patient volume for targeted diseases challenged the ability of the PCRC pharmacist to provide expected elements of care to targeted patients. Creation of a new pharmacy technician clinical support role was implemented as a cost-effective option to increase the pharmacist's efficiency. The pharmacist's work processes were reviewed and technical functions identified that could be assigned to a specially trained pharmacy technician under the direction of the PCRC pharmacist. Daily tasks performed by the pharmacy technician included maintenance of the patient roster and pending discharges, retrieval and documentation of pertinent laboratory and diagnostic test information from the patient's medical record, assembly of patient medication education materials, and identification of discrepancies between disparate systems' medication records. In the 6 months after establishing the PCRC pharmacy technician role, the pharmacist's completion of comprehensive medication reviews (CMRs) for target patients increased by 40.5% (p = 0.0223), driven largely by a 42.4% (p ...) increase. Use of a pharmacy technician to augment pharmacist care in a PCRC team extended the reach of the pharmacist and allowed more time for the pharmacist to engage patients. Technician support enabled the pharmacist to complete more CMRs and reduced the time required for chart reviews. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  19. Open Educational Resources: The Role of OCW, Blogs and Videos in Computer Networks Classroom

    Directory of Open Access Journals (Sweden)

    Pablo Gil

    2012-09-01

    This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several on-line multimedia resources included in a free on-line course about Computer Networks. These new educational resources are based on the Web 2.0 approach, such as blogs, videos and virtual labs, which have been added to a web site for distance self-learning.

  20. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B. Bavishna, M. Agalya, G. Kavitha

    2018-01-01

    A great deal of research has been done in the field of cloud computing, and a variety of algorithms has been proposed for its effective performance. The role of virtualization is significant, and its performance depends on VM migration and allocation. Clouds consume a large amount of energy; therefore, numerous algorithms are needed to save energy and enhance efficiency. In the proposed work, a green algorithm has been considered with ...

  1. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    Science.gov (United States)

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.

  2. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  3. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. This paper reports that, in order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960s. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within all hydroelectric power reservoir systems, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources, as illustrated in the sketch below.
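
    A minimal sketch of the duration-curve comparison described above. The hourly series, their statistical shapes, and the shortfall/surplus bookkeeping are illustrative assumptions, not Southwestern's actual model or data.

```python
import numpy as np

# Hypothetical hourly load and hydro-resource series for one year of a
# period of record (synthetic, for illustration only).
rng = np.random.default_rng(0)
load = rng.gamma(shape=9.0, scale=100.0, size=8760)      # MW demanded
resource = rng.gamma(shape=8.0, scale=110.0, size=8760)  # MW available

# A duration curve is the series sorted in descending order: the value at
# rank i is the level equaled or exceeded for i hours of the period.
load_dc = np.sort(load)[::-1]
resource_dc = np.sort(resource)[::-1]

# Comparing the curves rank by rank shows where the resource falls short
# of the load and where surplus energy could be reallocated.
shortfall = np.maximum(load_dc - resource_dc, 0.0)
surplus = np.maximum(resource_dc - load_dc, 0.0)

print(f"hours with shortfall: {(shortfall > 0).sum()}")
print(f"energy shortfall (MWh): {shortfall.sum():.0f}")
print(f"energy surplus (MWh): {surplus.sum():.0f}")
```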

  4. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
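
    Among the key technologies listed above is MapReduce-based parallel computing. The following is a minimal in-process illustration of the map, shuffle and reduce pattern; the records and keys are hypothetical, and the production system described in the abstract would distribute this work across a cluster rather than run it in one process.

```python
from collections import defaultdict

# Toy land-use records; in the real system these would be parcels or
# remote-sensing observations streamed across many worker nodes.
records = ["parcel1 breach", "parcel2 ok", "parcel3 breach", "parcel4 ok"]

# Map: emit a (key, 1) pair per record, keyed on the inspection result.
mapped = [(rec.split()[1], 1) for rec in records]

# Shuffle: group emitted values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group into a count.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # {'breach': 2, 'ok': 2}
```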

  5. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis-computational, algorithmic, and implementation-have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
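
    As a toy instance of the resource-rational recipe sketched above (my illustration of the idea, not an example from the paper): derive the optimal number of noisy samples an agent should draw when each sample improves choice accuracy but costs time. The effect size and cost parameters are arbitrary assumptions.

```python
import math

# An agent picks the better of two options from the mean of k noisy,
# unit-variance samples whose true mean difference is `effect`. More
# samples raise P(correct) = Phi(effect * sqrt(k)) but cost time, so the
# resource-rational algorithm is the k that maximizes expected net utility.
def expected_utility(k, effect=0.3, cost_per_sample=0.01):
    p_correct = 0.5 * (1.0 + math.erf(effect * math.sqrt(k) / math.sqrt(2.0)))
    return p_correct - cost_per_sample * k

best_k = max(range(1, 201), key=expected_utility)
print(best_k, round(expected_utility(best_k), 4))  # a modest, finite k
```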

  6. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

    This paper presents two methods for accurately computing the centers of periodic regions. One method applies to the general M-sets with integer index number; the other applies to the general M-sets with negative integer index number. Both methods improve the precision of the computation by transforming the polynomial equations which determine the centers of the periodic regions. We primarily discuss the general M-sets with negative integer index, and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. We can obtain the centers' coordinates with at least 48 significant digits after the decimal point in both real and imaginary parts by applying Newton's method to the transformed polynomial equation which determines the centers of the periodic regions. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k = 3, 4, 5, 6) for the index numbers α = −25, −24, …, −1, all of which have high numerical accuracy.
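
    The root-finding step the paper describes can be illustrated on the classical index-2 M-set, whose period-k region centers are roots of the critical-orbit polynomial P_k(c), with P_1 = c and P_{k+1} = P_k^2 + c. This sketch uses ordinary double precision rather than the paper's 48-digit arithmetic, and the classical set rather than the generalized M-sets studied, so it illustrates only the method, not the paper's results.

```python
# Newton's method for centers of period-k regions of the classical M-set.
def critical_orbit_poly(c, k):
    """Return P_k(c) and dP_k/dc, where P_1 = c and P_{k+1} = P_k**2 + c."""
    p, dp = c, 1.0 + 0.0j
    for _ in range(k - 1):
        p, dp = p * p + c, 2.0 * p * dp + 1.0
    return p, dp

def newton_center(c0, k, tol=1e-14, max_iter=100):
    c = complex(c0)
    for _ in range(max_iter):
        p, dp = critical_orbit_poly(c, k)
        step = p / dp
        c -= step
        if abs(step) < tol:
            return c
    raise RuntimeError("Newton iteration did not converge")

# Example: a period-3 center of the Mandelbrot set.
print(newton_center(-0.1 + 0.75j, 3))  # approx. -0.122561+0.744862j
```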

  7. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions of Euros – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructure...

  8. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng; Fei, Shiyang; Zongan, Wang; Li, Yu; Zhao, Feng; Gao, Xin

    2018-01-01

    structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology

  9. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent
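
    The building block named in the abstract is a polynomial-time minimum-cut computation. A generic illustration follows; the toy graph is hypothetical and does not reproduce the authors' construction, in which nodes would encode activity/start-time decisions arising from the Lagrangian relaxation.

```python
import networkx as nx

# Toy capacitated digraph standing in for the relaxation's cut problem.
G = nx.DiGraph()
G.add_edge("s", "a", capacity=3.0)
G.add_edge("s", "b", capacity=2.0)
G.add_edge("a", "b", capacity=1.0)
G.add_edge("a", "t", capacity=2.0)
G.add_edge("b", "t", capacity=3.0)

# The min-cut value is the quantity that would feed the lower bound.
cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
print(cut_value)               # 5.0
print(source_side, sink_side)  # partition of the nodes across the cut
```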

  10. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Criteria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  11. The Usage of informal computer based communication in the context of organization’s technological resources

    OpenAIRE

    Raišienė, Agota Giedrė; Jonušauskas, Steponas

    2011-01-01

    The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology - meta-analysis, survey and descriptive analysis. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture, and a feeling of interdependence and affinity. Also, informal communication widens the ...

  12. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from traditional learning-centred approaches to collaborative learning-centred approaches that emphasise pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  13. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    Science.gov (United States)

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  14. Photonic entanglement as a resource in quantum computation and quantum communication

    OpenAIRE

    Prevedel, Robert; Aspelmeyer, Markus; Brukner, Caslav; Jennewein, Thomas; Zeilinger, Anton

    2008-01-01

    Entanglement is an essential resource in current experimental implementations for quantum information processing. We review a class of experiments exploiting photonic entanglement, ranging from one-way quantum computing over quantum communication complexity to long-distance quantum communication. We then propose a set of feasible experiments that will underline the advantages of photonic entanglement for quantum information processing.

  15. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper operation. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources and queuing theory. The analytical results obtained are related to a practical experiment, showing interesting and valuable results.
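
    A generic sketch of fitting a stretched-exponential (Kohlrausch) curve, the functional family named in the title. The synthetic data, parameter values, and use of SciPy are illustrative assumptions, not the authors' cache-memory measurements or model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Stretched exponential: f(t) = exp(-(t / tau)**beta); beta < 1 yields the
# heavy tails associated with long-range dependence in system behaviour.
def stretched_exp(t, tau, beta):
    return np.exp(-((t / tau) ** beta))

t = np.linspace(0.1, 50.0, 200)
rng = np.random.default_rng(1)
observed = stretched_exp(t, 8.0, 0.6) + rng.normal(0.0, 0.01, t.size)

(tau_hat, beta_hat), _ = curve_fit(stretched_exp, t, observed, p0=(5.0, 0.8))
print(f"tau ~ {tau_hat:.2f}, beta ~ {beta_hat:.2f}")  # near 8.0 and 0.6
```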

  16. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  17. The Role of the Radiation Safety Information Computational Center (RSICC) in Knowledge Management

    International Nuclear Information System (INIS)

    Valentine, T.

    2016-01-01

    The Radiation Safety Information Computational Center (RSICC) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 packages that have been provided by contributors from various agencies. RSICC’s customers obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to help ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programmes both domestically and internationally, as the majority of RSICC’s customers are students attending U.S. universities. RSICC also supports and promotes workshops and seminars in nuclear science and technology to further the use and/or development of computational tools and data. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC’s activities, services, and systems that support knowledge management and education and training in the nuclear field. (author)

  18. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  19. Annual report of Nuclear Human Resource Development Center. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-07-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in the fiscal year (FY) 2015. In FY 2015, we were actively engaged in organizing special training courses in response to external training needs, cooperating with universities, and offering international training courses for Asian countries in addition to the regular training programs at NuHRDeC. In accordance with the annual plan for national training, we conducted training courses for radioisotope and radiation engineers, nuclear energy engineers, and national qualification examinations, as well as for officials of the Nuclear Regulatory Authority and prefectural and municipal officials in Fukushima, as outreach activities to meet the training needs of external organizations. We continued to enhance cooperative activities with universities, such as the acceptance of postdoctoral researchers and cooperation under the cooperative graduate school system, including the acceptance of students from the Nuclear Professional School of the University of Tokyo. Furthermore, through utilizing the remote education system, the joint course was successfully held with seven universities, and the intensive summer course and the practical exercise at the Nuclear Fuel Cycle Engineering Laboratories were also conducted as part of the collaboration network with universities. The Instructor Training Program (ITP) was continually offered to the ITP participating countries (Bangladesh, China, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Saudi Arabia, Sri Lanka, Thailand, Turkey and Viet Nam) in FY2015 under contract with the Ministry of Education, Culture, Sports, Science and Technology. As part of the ITP, the Instructor Training Course and the Nuclear Technology Seminar, such as the “Reactor Engineering Course” and the “Basic Radiation Knowledge for School Education Seminar”, were organized at NuHRDeC. Eight and eleven countries

  20. The Earth Resources Observation Systems data center's training technical assistance, and applications research activities

    Science.gov (United States)

    Sturdevant, J.A.

    1981-01-01

    The Earth Resources Observation Systems (EROS) Data Center (EDC), administered by the U.S. Geological Survey, U.S. Department of the Interior, provides remotely sensed data to the user community and offers a variety of professional services to further the understanding and use of remote sensing technology. EDC reproduces and sells photographic and electronic copies of satellite images of areas throughout the world. Other products include aerial photographs collected by 16 organizations, including the U.S. Geological Survey and the National Aeronautics and Space Administration. Primary users of the remotely sensed data are Federal, State, and municipal government agencies, universities, foreign nations, and private industries. The professional services available at EDC are primarily directed at integrating satellite and aircraft remote sensing technology into the programs of the Department of the Interior and its cooperators. This is accomplished through formal training workshops, user assistance, cooperative demonstration projects, and access to equipment and capabilities in an advanced data analysis laboratory. In addition, other Federal agencies, State and local governments, universities, and the general public can get assistance from the EDC staff. Since 1973, EDC has contributed to the accelerating growth in development and operational use of remotely sensed data for land resource problems through its role as educator and by conducting basic and applied remote sensing applications research. As remote sensing technology continues to evolve, EDC will continue to respond to the increasing demand for timely information on remote sensing applications. Questions most often asked about EDC's research and training programs include: Who may attend an EDC remote sensing training course? Specifically, what is taught? Who may cooperate with EDC on remote sensing projects? Are interpretation services provided on a service basis? This report attempts to define the goals and

  1. Heuristic evaluation of online COPD respiratory therapy and education video resource center.

    Science.gov (United States)

    Stellefson, Michael; Chaney, Beth; Chaney, Don

    2014-10-01

    Purpose: Because of limited accessibility to pulmonary rehabilitation programs, patients with chronic obstructive pulmonary disease (COPD) are infrequently provided with patient education resources. To help educate patients with COPD on how to live a better life with diminished breathing capacity, we developed a novel social media resource center containing COPD respiratory therapy and education videos called "COPDFlix." A heuristic evaluation of COPDFlix was conducted as part of a larger study to determine whether the prototype was successful in adhering to formal Web site usability guidelines for older adults. A purposive sample of three experts, with expertise in Web design and health communications technology, was recruited (a) to identify usability violations and (b) to propose solutions to improve the functionality of the COPDFlix prototype. Each expert evaluated 18 heuristics in four categories of task-based criteria (i.e., interaction and navigation, information architecture, presentation design, and information design). Seventy-six subcriteria across these four categories were assessed. Quantitative ratings and qualitative comments from each expert were compiled into a single master list, noting the violated heuristic and the type and location of each problem. Sixty-one usability violations were identified across the 18 heuristics. Evaluators rated the majority of heuristic subcriteria as either a "minor hindrance" (n=32) or "no problem" (n=132). Moreover, only 2 of the 18 heuristic categories were noted as "major" violations, with mean severity scores of ≥3. Mixed-methods data analysis helped the multidisciplinary research team to categorize and prioritize usability problems and solutions, leading to 26 discrete design modifications within the COPDFlix prototype.

  2. The Effects of Yoga, Massage, and Reiki on Patient Well-Being at a Cancer Resource Center.

    Science.gov (United States)

    Rosenbaum, Mark S; Velde, Jane

    2016-06-01

    Cancer resource centers offer patients a variety of therapeutic services. However, patients with cancer and cancer healthcare practitioners may not fully understand the specific objectives and benefits of each service. This research offers guidance to cancer healthcare practitioners on how they can best direct patients to partake in specific integrative therapies, depending on their expressed needs. This article investigates the effects of yoga, massage, and Reiki services administered in a cancer resource center on patients' sense of personal well-being. The results show how program directors at a cancer resource center can customize therapies to meet the needs of patients' well-being. The experimental design measured whether engaging in yoga, massage, or Reiki services affects the self-perceived well-being of 150 patients at a cancer resource center at two points in time. All three services helped decrease stress and anxiety, improve mood, and enhance cancer center patrons' perceived overall health and quality of life in a similar manner. Reiki reduced the pain of patients with cancer to a greater extent than either massage or yoga.

  3. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

    The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results.

  4. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  5. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  6. Annual report of Nuclear Human Resource Development Center. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2017-06-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in the fiscal year (FY) 2014. In FY 2014, we flexibly designed special training courses corresponding to outside training needs, while organizing the annually scheduled regular training programs. We also actively addressed challenging issues in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. Besides these regular courses, we also organized special training courses based on outside needs, e.g., for the Nuclear Regulatory Authority or the people of Naraha town in Fukushima Prefecture. JAEA continued its cooperative activities with universities. In respect of the cooperation with the graduate school of the University of Tokyo, we accepted nuclear-major students and cooperatively conducted lectures and practical exercises for one year. In terms of the collaboration network with universities, the joint course was successfully held with six universities through utilizing the remote education system. Besides, the intensive summer course and practical exercise at the Nuclear Fuel Cycle Engineering Laboratories were also conducted. Furthermore, JAEA re-signed the “Japan Nuclear Education Network” agreement with 7 universities in February 2015 to reflect the new participation of Nagoya University from FY 2015. Concerning international training, we continuously implemented the Instructor Training Program (ITP) by receiving the annual sponsorship from the Ministry of Education, Culture, Sports, Science and Technology. In FY 2014, eight countries (i.e. Bangladesh, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Thailand and Vietnam) joined these instructor training courses, such as the “Reactor Engineering Course”. Furthermore, we organized nuclear technology seminar courses, e.g. “Basic Radiation Knowledge for School Education”. In respect of

  7. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    Science.gov (United States)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass roots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  8. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    International Nuclear Information System (INIS)

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, and have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on a knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach in integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy.
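
    A toy matchmaking step in the spirit of the broker described above: rank compute elements by an estimated wait, discounting status reports by their age, since the information is necessarily out of date. The resource records, staleness penalty, and ranking rule are invented for illustration and are not DataGrid WP1 code.

```python
import time

# Hypothetical status snapshot of three compute elements (CEs).
resources = [
    {"ce": "ce01.example.org", "queued": 12, "cpus": 64, "seen": time.time() - 30},
    {"ce": "ce02.example.org", "queued": 3,  "cpus": 16, "seen": time.time() - 600},
    {"ce": "ce03.example.org", "queued": 0,  "cpus": 8,  "seen": time.time() - 5},
]

def rank(r, staleness_penalty=0.001):
    age = time.time() - r["seen"]               # seconds since the last report
    est_wait = r["queued"] / r["cpus"]          # crude queue-time estimate
    return est_wait + staleness_penalty * age   # distrust old information

best = min(resources, key=rank)
print("submit to:", best["ce"])
```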

  9. Implementation an human resources shared services center: Multinational company strategy in fusion context

    Directory of Open Access Journals (Sweden)

    João Paulo Bittencourt

    2016-09-01

    The aim of this research was to analyze the process of implementation and management of a Shared Services Center for Human Resources in a multinational company in the context of mergers and acquisitions. The company analyzed, referred to here as Alpha, is one of the largest food companies in the country and was born of a merger between Beta and Delta in 2008. The CSC may constitute a tool for the strategic management of HR that allows repositioning the role of the area so as to be more strategic at the corporate level and more profitable at the operating level. The research was based on a descriptive and exploratory study with a qualitative approach. Among the results is the fact that shared services were strategic to support, standardize and ensure the expansion of the company. The challenges found were associated with the development of a culture of service, the relationship with users, and the definition of the scope of HR activities. Remaining management issues include the adequacy of wage differences between employees, the limitation of career paths, the need to attract and retain talent, and international expansion.

  10. The International Center for Integrated Water Resources Management (ICIWaRM): The United States' Contribution to UNESCO IHP's Global Network of Water Centers

    Science.gov (United States)

    Logan, W. S.

    2015-12-01

    The concept of a "category 2 center"—i.e., one that is closely affiliated with UNESCO, but not legally part of UNESCO—dates back many decades. However, only in the last decade has the concept been fully developed. Within UNESCO, the International Hydrological Programme (IHP) has led the way in creating a network of regional and global water-related centers. ICIWaRM—the International Center for Integrated Water Resources Management—is one member of this network. Approved by UNESCO's General Conference, the center has been operating since 2009. It was designed to fill a niche in the system for a center that was backed by an institution with on-the-ground water management experience, but that also had strong connections to academia, NGOs and other governmental agencies. Thus, ICIWaRM is hosted by the US Army Corps of Engineers' Institute for Water Resources (IWR), but established with an internal network of partner institutions. Three main factors have contributed to any success that ICIWaRM has achieved in its global work: (1) a focus on practical science and technology which can be readily transferred, including the Corps' own methodologies and models for planning and water management, and those of our university and government partners; (2) collaboration with other UNESCO centers on joint applied research, capacity-building and training (a network of centers needs to function as a network, and ICIWaRM has worked together with UNESCO-affiliated centers in Chile, Brazil, Paraguay, the Dominican Republic, Japan, China, and elsewhere); and (3) partnering with and supporting existing UNESCO-IHP programs: ICIWaRM serves as the Global Technical Secretariat for IHP's Global Network on Water and Development Information in Arid Lands (G-WADI), and work through G-WADI, in addition to directly supporting IHP, helps the center to frame, prioritize and integrate its activities. With the recent release of the United Nations' 2030 Agenda for Sustainable Development, it is clear that

  11. The Bone Marrow Transplantation Center of the National Cancer Institute - its resources to assist patients with bone marrow failure

    International Nuclear Information System (INIS)

    Tabak, Daniel

    1997-01-01

    This paper describes the bone marrow transplantation center of the Brazilian National Cancer Institute, which is responsible for cancer control in Brazil. The document also describes the resources available in the Institute for assisting patients presenting with bone marrow failure. The Center provides allogeneic and autologous bone marrow transplants, peripheral stem cell transplants, umbilical cord collections and transplants, and has a small experience with unrelated bone marrow transplants. The Center receives patients from all over the country and provides very sophisticated medical care at no direct cost to the patients.

  12. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to the comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform for a constructivist learning process, thus enabling learners' experimentation with the provided programming models, building learners' competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing programming models, and the message passing interface (MPI) and OpenMP parallelization tools, have been chosen for implementation.
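
    The paper itself builds its models in C with MPI and OpenMP; as a self-contained sketch of the same idea, the following uses Python threads and queues to run a three-stage software pipeline in which the stages overlap in time, the behaviour a multiphase queueing model captures. The stage functions and sentinel protocol are illustrative assumptions.

```python
import queue
import threading

# Each stage consumes from an input queue and feeds the next stage, so
# successive items are processed by different stages concurrently.
def stage(fn, q_in, q_out):
    while True:
        item = q_in.get()
        if item is None:      # sentinel: propagate shutdown downstream
            q_out.put(None)
            break
        q_out.put(fn(item))

q1, q2, q3, q4 = (queue.Queue() for _ in range(4))
workers = [
    threading.Thread(target=stage, args=(lambda x: x + 1, q1, q2)),
    threading.Thread(target=stage, args=(lambda x: x * 2, q2, q3)),
    threading.Thread(target=stage, args=(lambda x: x - 3, q3, q4)),
]
for w in workers:
    w.start()
for item in [1, 2, 3, 4, None]:
    q1.put(item)

results = []
while (out := q4.get()) is not None:
    results.append(out)
print(results)  # [(x + 1) * 2 - 3 for x in 1..4] -> [1, 3, 5, 7]
```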

  13. The psychology of computer displays in the modern mission control center

    Science.gov (United States)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  14. Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  15. The Amarillo National Resource Center for Plutonium. Quarterly progress detailed report, 1 November 1996--31 January 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    Progress for this quarter is given for each of the following Center programs: (1) plutonium information resource; (2) advisory function (DOE and state support); (3) environmental, public health and safety; (4) communication, education, and training; and (5) nuclear and other material studies. Both summaries of the activities and detailed reports are included.

  16. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for some of the projects.

  17. Annual report of Nuclear Human Resource Development Center. April 1, 2011 - March 31, 2012

    International Nuclear Information System (INIS)

    2013-11-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in the fiscal year 2011. In this fiscal year, we flexibly designed and conducted training courses corresponding to needs from outside JAEA, while conducting the annually scheduled training programs, and also actively addressed the challenge of human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. The number of trainees who completed the domestic training courses in 2011 increased to 387, which is 14 percent more than the previous year. In addition, in order to respond to the Tokyo Electric Power Company (TEPCO)'s Fukushima No.1 nuclear power plant accident, we newly designed and organized special training courses on radiation survey for the subcontracting companies working with TEPCO, and training courses on decontamination work for the construction companies in Fukushima prefecture. The total number of attendees in these special courses was 3,800 persons. JAEA continued its cooperative activities with universities. In respect of the cooperation with the graduate school of the University of Tokyo, we accepted 17 students and cooperatively conducted practical exercises for nuclear majors. Furthermore, we also actively continued cooperation on practical exercises for students of universities participating in the Nuclear HRD Program. In terms of the collaboration network with universities, the joint course was held with six universities through utilizing the remote education system. Furthermore, the intensive course at Okayama University and the practical exercise at the Nuclear Fuel Cycle Engineering Laboratories of JAEA were also conducted. In respect of international training, NuHRDeC continuously implemented the Instructor Training Program (ITP) by receiving the annual sponsorship from MEXT. In fiscal year 2011, seven countries (i.e. Bangladesh

  18. Annual report of Nuclear Human Resource Development Center. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-07-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in FY2013. In FY2013, we flexibly designed special training courses corresponding to outside training needs, while organizing the annually scheduled regular training programs. We also actively addressed challenging issues in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. The number of trainees who participated in the domestic regular training courses in 2013 was more than 300 persons. Besides these regular courses, we also organized special training courses based on outside needs, e.g. the training courses on radiation survey and decontamination work in Fukushima prefecture for the subcontracting companies of the Tokyo Electric Power Company (TEPCO) working to respond to the TEPCO's Fukushima Daiichi nuclear power station accident. JAEA continued its cooperative activities with universities. In respect of the cooperation with the graduate school of the University of Tokyo, we accepted nuclear-major students and cooperatively conducted lectures and practical exercises for one year. In terms of the collaboration network with universities, the joint course was successfully held with six universities through utilizing the remote education system. Furthermore, the intensive courses at Okayama University and the University of Fukui, and the practical exercise at the Nuclear Fuel Cycle Engineering Laboratories of JAEA, were also conducted. In respect of international training, we continuously implemented the Instructor Training Program (ITP) by receiving the annual sponsorship from the Ministry of Education, Culture, Sports, Science and Technology. In fiscal year 2013, eight countries (i.e. Bangladesh, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Thailand, Vietnam) joined these instructor training courses. Furthermore, we organized nuclear

  19. Annual report of Nuclear Human Resource Development Center. April 1, 2010 - March 31, 2011

    International Nuclear Information System (INIS)

    2012-03-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in the fiscal year 2010. In this fiscal year, NuHRDeC flexibly designed and conducted as-needed training courses upon request while conducting the annually scheduled training programs, and actively addressed the challenge of human resource development, such as enhancing collaboration with academia and expanding the number of participating countries for international training. The number of trainees who completed the domestic training courses in 2010 increased slightly to 340, which is 6 percent more than the previous year. The number of those who completed the staff technical training courses was 879 in 2010, which is 12 percent more than the previous year. As a result, the total number of trainees during this period is about 10 percent more than the previous year. In order to correspond with needs from outside of JAEA, four temporary courses were held upon request from the Nuclear and Industrial Safety Agency (NISA), Ministry of Economy, Trade and Industry (METI). JAEA continued its cooperative activities with universities: cooperation with the graduate school of the University of Tokyo continued, and the cooperative graduate school program was enlarged to cooperate with a total of 19 graduate schools, one undergraduate faculty, and one technical college, including one graduate school newly joined in 2010. JAEA also continued cooperative activities with the Nuclear HRD Program initiated by MEXT and METI in 2007. The joint course continued networking with six universities through utilizing the remote education system, Japan Nuclear Education Network (JNEN), and special lectures and summer and winter practice were also conducted. In respect of international training, NuHRDeC continuously implemented the Instructor Training Program (ITP) by receiving the annual sponsorship from MEXT. In fiscal year 2010, four countries (Bangladesh

  20. Annual report of Nuclear Human Resource Development Center. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-03-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year 2012. In this fiscal year, we flexibly designed training courses in response to outside needs while running the annually scheduled training programs, and we actively addressed challenging human resource development issues, such as enhancing collaboration with academia and organizing international training for Asian countries. The number of trainees who completed the domestic training courses in 2012 increased to 525, 30 percent more than the previous year. In addition, in response to the accident at the Tokyo Electric Power Company (TEPCO)'s Fukushima No. 1 nuclear power plant, we organized special training courses on radiation surveying for TEPCO's subcontractors and courses on decontamination work for construction companies in Fukushima prefecture. The total number of attendees in these special courses was more than 4,000. JAEA continued its cooperative activities with universities. In cooperation with the graduate school of the University of Tokyo, we accepted 14 students and jointly conducted practical exercises for nuclear majors. We also actively continued cooperation on practical exercises for students of the universities signed on to the Nuclear HRD Program. Within the collaboration network with universities, the joint course was held with six universities using the remote education system. Furthermore, intensive courses at Okayama University and Fukui University, and a practical exercise at JAEA's Nuclear Fuel Cycle Engineering Laboratories, were also conducted. With respect to international training, NuHRDeC continued to implement the Instructor Training Program (ITP) under annual sponsorship from MEXT. In fiscal year 2012, eight countries (i

  1. The retention of health human resources in primary healthcare centers in Lebanon: a national survey.

    Science.gov (United States)

    Alameddine, Mohamad; Saleh, Shadi; El-Jardali, Fadi; Dimassi, Hani; Mourad, Yara

    2012-11-22

    Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector, considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit, as well as the factors significantly associated with staff retention at PHC centers in Lebanon. A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/professional background, organizational/institutional characteristics, likelihood to quit, and level of professional burnout (using the Maslach Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Two out of five respondents indicated likelihood to quit their jobs within the next 1-3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR, targeted retention initiatives should focus on the young-new recruits and allied health professionals. Particular attention should
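    The analytic step described above can be illustrated with a short, hedged sketch: a multinomial logistic regression of a three-level quit-intent outcome on burnout, education, and tenure. The variable names and the synthetic data below are illustrative assumptions, not the study's actual data.

```python
# Hedged sketch of the reported analysis: multinomial logistic regression
# of a 3-level quit-intent outcome. All names and data are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 755                                   # matches the reported sample size
burnout = rng.normal(0, 1, n)             # standardized burnout score
education = rng.integers(1, 4, n)         # 1 = diploma .. 3 = postgraduate
tenure = rng.exponential(6, n)            # years in post

# Simulated quit intent: 0 = staying, 1 = unsure, 2 = likely to quit
logit_quit = 0.8 * burnout - 0.4 * education - 0.1 * tenure
p_quit = 1 / (1 + np.exp(-logit_quit))
y = np.where(rng.random(n) < p_quit, 2, rng.integers(0, 2, n))

X = sm.add_constant(np.column_stack([burnout, education, tenure]))
fit = sm.MNLogit(y, X).fit(disp=False)
print(fit.summary())                      # one coefficient set per outcome level
```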

  2. The retention of health human resources in primary healthcare centers in Lebanon: a national survey

    Directory of Open Access Journals (Sweden)

    Alameddine Mohamad

    2012-11-01

    Abstract: Background: Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector, considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit, as well as the factors significantly associated with staff retention at PHC centers in Lebanon. Methods: A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/professional background, organizational/institutional characteristics, likelihood to quit, and level of professional burnout (using the Maslach Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Results: Two out of five respondents indicated likelihood to quit their jobs within the next 1–3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. Conclusions: The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR, targeted retention initiatives should focus on the young-new recruits

  3. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and as a result has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has significantly lowered the entry barrier for data centre participation, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  4. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo process, we explore the global latency for optimal to suboptimal resource assignments at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
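    As an illustration of the optimal-to-suboptimal exploration described above, the following toy sketch anneals a random task assignment with the Metropolis rule. It is a hedged stand-in, not the authors' code: the random graph, the use of shortest-path distance as the latency/energy, and all sizes and temperatures are invented assumptions.

```python
import math
import random
import networkx as nx

random.seed(1)
G = nx.erdos_renyi_graph(50, 0.1, seed=1)            # communication network
dist = dict(nx.all_pairs_shortest_path_length(G))
tasks = [random.randrange(50) for _ in range(200)]   # task origin nodes
assign = [random.randrange(50) for _ in tasks]       # initial random placement

def latency(i, node):
    # Shortest-path distance as a latency proxy; unreachable nodes penalized.
    return dist[tasks[i]].get(node, 50)

def sweep(T):
    # One Metropolis sweep: propose moving each task to a random node.
    for i in range(len(tasks)):
        new = random.randrange(50)
        dE = latency(i, new) - latency(i, assign[i])
        if dE <= 0 or random.random() < math.exp(-dE / T):
            assign[i] = new                          # Metropolis acceptance

for T in (5.0, 1.0, 0.2):                            # suboptimal -> near-optimal
    for _ in range(100):
        sweep(T)
    mean = sum(latency(i, a) for i, a in enumerate(assign)) / len(tasks)
    print(f"T={T}: mean latency {mean:.2f}")
```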

  5. The NILE system architecture: fault-tolerant, wide-area access to computing and data resources

    International Nuclear Information System (INIS)

    Ricciardi, Aleta; Ogg, Michael; Rothfus, Eric

    1996-01-01

    NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

  6. SPA AND CLIMATIC RESORTS (CENTERS) AS RESOURCES OF PROGRAM OF SPORT RECREATION IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Ivica Nikolić

    2006-06-01

    Civilized man aspires to improve work so as to achieve the greatest possible productivity with the least possible expenditure of labour. An unavoidable result of this process is a kind of fatigue with hypokinesiological characteristics, given the demands of the modern work process. The most effective way to fight this fatigue is an active holiday that is purposefully programmed, led and carried out through the movement of tourists, supported by natural factors, among which climate and healing waters are particularly important. These very resources characterize the tourist potential of Serbia and Montenegro, with many facilities available at 1000 m above sea level and spa centers with springs, a complete offer of physio-prophylactic procedures, and accompanying facilities for sport recreation. The incorporation of programmed active holidays into the tourist offer of Serbia and Montenegro represents a prospect for the development of tourism and the tourist economy, with effects of multiple importance both for participants and for the level of tourist consumption, and it will certainly help lengthen the tourist season, the primary goal of every catering establishment. Surveys show that the preferences and viewpoints of potential tourists are especially directed towards sport games and activities on and in the water, as part of the elementary tourist offer in spas and climatic resorts and their available facilities. The recommendations and postulates of the sport recreation program, presented in four charts, form the basis of a marketing strategy for entering the tourist market, with permanent education of management personnel and further research into expanding the potential market. The publication and distribution of advertising materials are especially important, both in the domestic market and in foreign markets, where the abundance

  7. The NIH-NIAID Schistosomiasis Resource Center at the Biomedical Research Institute: Molecular Redux.

    Directory of Open Access Journals (Sweden)

    James J Cody

    2016-10-01

    Schistosomiasis remains a health burden in many parts of the world. The complex life cycle of Schistosoma parasites and the economic and societal conditions present in endemic areas make the prospect of eradication unlikely in the foreseeable future. Continued and vigorous research efforts must therefore be directed at this disease, particularly since only a single World Health Organization (WHO)-approved drug is available for treatment. The National Institutes of Health (NIH)-National Institute of Allergy and Infectious Diseases (NIAID) Schistosomiasis Resource Center (SRC) at the Biomedical Research Institute provides investigators with the critical raw materials needed to carry out this important research. The SRC makes available, free of charge (including international shipping costs), not only infected host organisms but also a wide array of molecular reagents derived from all life stages of each of the three main human schistosome parasites. As the field of schistosomiasis research rapidly advances, it is likely to become increasingly reliant on omics, transgenics, epigenetics, and microbiome-related research approaches. The SRC has and will continue to monitor and contribute to advances in the field in order to support these research efforts with an expanding array of molecular reagents. In addition to providing investigators with source materials, the SRC has expanded its educational mission by offering a molecular techniques training course and has recently organized an international schistosomiasis-focused meeting. This review provides an overview of the materials and services that are available at the SRC for schistosomiasis researchers, with a focus on updates that have occurred since the original overview in 2008.

  8. The Counseling Center: An Undervalued Resource in Recruitment, Retention, and Risk Management

    Science.gov (United States)

    Bishop, John B.

    2010-01-01

    A primary responsibility for directors of college and university counseling centers is to explain to various audiences the multiple ways such units are of value to their institutions. This article reviews the history of how counseling center directors have been encouraged to develop and describe the work of their centers. Often overlooked are the…

  9. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  10. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time, optimal control of dynamic systems while also considering the constraints to which these systems may be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computational resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.
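    A minimal sketch of the event-driven idea follows, under assumed dynamics (a double integrator) and invented tuning values: the controller keeps applying the tail of the last computed input sequence and re-solves the finite-horizon problem only when the measured state drifts from its prediction by more than a threshold. An unconstrained LQ solver stands in for a full constrained MPC solver; this is not the paper's algorithm.

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double-integrator model, dt = 0.1
B = np.array([[0.005], [0.1]])
Q, R, N = np.eye(2), np.array([[0.1]]), 20

def solve_mpc(x0):
    """Unconstrained finite-horizon LQ 'MPC' via a backward Riccati pass."""
    P, gains = Q, []
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    us, xs, x = [], [x0], x0
    for K in reversed(gains):              # roll the optimal plan forward
        u = -K @ x
        x = A @ x + B @ u
        us.append(u)
        xs.append(x)
    return us, xs                          # planned inputs, predicted states

rng = np.random.default_rng(0)
x = np.array([1.0, 0.5])
us, xs = solve_mpc(x)
k, solves = 0, 1
for t in range(80):
    x = A @ x + B @ us[k] + rng.normal(0.0, 0.02, 2)   # disturbed plant
    k += 1
    # Event: plan exhausted, or state drifted too far from the prediction.
    if k >= N or np.linalg.norm(x - xs[k]) > 0.05:
        us, xs = solve_mpc(x)
        k, solves = 0, solves + 1
print(f"re-solved {solves} times in 80 steps (periodic MPC would use 80)")
```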

  11. Piping data bank and erection system of Angra 2: structure, computational resources and systems

    International Nuclear Information System (INIS)

    Abud, P.R.; Court, E.G.; Rosette, A.C.

    1992-01-01

    The piping data bank of Angra 2, called the Erection Management System, was developed to manage the piping erection of the Angra 2 nuclear power plant. Beyond the erection follow-up of piping and supports, it manages the piping design, material procurement, the flow of fabrication documents, weld testing, and material stocks at the warehouse. The work carried out to define the structure of the data bank, the computational resources, and the systems is described here. (author)

  12. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns in designing a cooperative D2D computational resource sharing system, since mutual communication may not be stably available due to user mobility. A previous endeavour demonstrated how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower average task execution time. There are doubts, however, about whether this type of task scheduling scheme, though effective, is fair to users. In other words, it can be unfair to users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users' computational task cooperation is recorded on the public blockchain ledger of the system as transactions, and each user's credit balance is easily accessible from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that, with a minor sacrifice of average task execution time, the level of fairness can be greatly enhanced.
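    The ledger-plus-scheduling idea can be sketched as follows. This is a toy illustration with invented names, not the paper's protocol: cooperation events are appended to a hash-chained ledger, credit balances are derived from it, and the scheduler favors well-connected helpers who are in credit deficit, so past contributors are eventually repaid.

```python
# Toy credit ledger: each entry records who helped whom and for how long.
import hashlib, json, time

ledger = []

def append_tx(helper, requester, cpu_seconds):
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    tx = {"helper": helper, "requester": requester,
          "work": cpu_seconds, "ts": time.time(), "prev": prev}
    tx["hash"] = hashlib.sha256(
        json.dumps(tx, sort_keys=True).encode()).hexdigest()
    ledger.append(tx)

def balance(user):
    # Credit = work contributed minus work consumed, derived from the ledger.
    return sum(t["work"] for t in ledger if t["helper"] == user) - \
           sum(t["work"] for t in ledger if t["requester"] == user)

def pick_helper(candidates, connectivity):
    # Prefer predicted connectivity; among equals, pick the user deepest
    # in credit deficit so contributors are repaid (fairness enforcement).
    return max(candidates, key=lambda u: (connectivity[u], -balance(u)))

append_tx("alice", "bob", 12.0)
append_tx("bob", "carol", 3.0)
helper = pick_helper(["alice", "bob"], {"alice": 0.9, "bob": 0.9})
print(helper, balance("alice"), balance("bob"))   # -> bob 12.0 -9.0
```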

  13. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had a 2% or greater lung cancer risk over 3 years, as estimated by a risk prediction tool, were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals at high risk for developing lung cancer using LDCT and the average initial cost of curative-intent treatment were lower than the average per-person cost of treating advanced-stage lung cancer, which infrequently results in a cure.

  14. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    In this paper, we explore the presence of collocational relations in computer-assisted language learning systems and other language resources for the Japanese language on the one hand, and in Japanese language learning textbooks and wordlists on the other. After introducing the importance of learning collocational relations in a foreign language, we examine their coverage in various learners' resources for the Japanese language. We concentrate in particular on a few collocations at the beginner's level, for which we demonstrate their treatment across the various resources. Special attention is paid to what are referred to as unpredictable collocations, which carry a greater foreign-language learning burden than predictable ones.

  15. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
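    A hedged sketch of this allocation framing follows (invented data; a greedy utility-density heuristic stands in for the paper's knapsack solver): each task offers several execution options, each with a utility and a multidimensional resource cost, and at most one option per task may be selected within the capacities.

```python
# Greedy heuristic for a 0-1 multichoice multidimensional knapsack:
# pick options in order of utility per normalized resource cost.
capacity = {"cpu": 8.0, "net": 4.0}

# task -> list of (option_name, cost_dict, utility); all values invented
options = {
    "t1": [("siteA", {"cpu": 2, "net": 1}, 5.0),
           ("siteB", {"cpu": 4, "net": 0.5}, 7.0)],
    "t2": [("siteA", {"cpu": 3, "net": 2}, 6.0)],
    "t3": [("siteB", {"cpu": 5, "net": 1}, 9.0),
           ("siteA", {"cpu": 1, "net": 1}, 3.0)],
}

def density(cost, utility):
    # Utility divided by total fraction of each capacity dimension consumed.
    return utility / sum(cost[d] / capacity[d] for d in capacity)

chosen, used = {}, {d: 0.0 for d in capacity}
candidates = [(density(c, u), t, o, c, u)
              for t, opts in options.items() for (o, c, u) in opts]
for _, task, opt, cost, util in sorted(candidates, reverse=True):
    if task in chosen:                       # multichoice: one option per task
        continue
    if all(used[d] + cost[d] <= capacity[d] for d in capacity):
        chosen[task] = opt
        for d in capacity:
            used[d] += cost[d]

print(chosen, used)
```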

  16. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most commonly reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on

  17. The Trope Tank: A Laboratory with Material Resources for Creative Computing

    Directory of Open Access Journals (Sweden)

    Nick Montfort

    2014-12-01

    http://dx.doi.org/10.5007/1807-9288.2014v10n2p53 Principles for organizing and making use of a laboratory with material computing resources are articulated. This laboratory, the Trope Tank, is a facility for teaching, research, and creative collaboration that offers hardware (in working condition and set up for use) from the 1970s, 1980s, and 1990s, including videogame systems, home computers, and an arcade cabinet. To aid in investigating the material history of texts, the lab has a small 19th-century letterpress, a typewriter, a print terminal, and dot-matrix printers. Other resources include controllers, peripherals, manuals, books, and software on physical media. These resources are used for teaching, loaned for local exhibitions and presentations, and accessed by researchers and artists. The space is primarily a laboratory (rather than a library, studio, or museum), so materials are organized by platform and intended use. Textual information about the historical contexts of the available systems is provided, and resources are set up to allow easy operation, and even casual use, by researchers, teachers, students, and artists.

  18. Testing a computer-based ostomy care training resource for staff nurses.

    Science.gov (United States)

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year of acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post-test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8, and after viewing the resource program, post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post-test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post-testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.
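    The pre/post comparison reported above can be reproduced in outline with a paired test on illustrative, randomly generated data (not the study's scores); for a two-condition repeated-measures design the F statistic equals the square of the paired t.

```python
# Illustrative paired pre/post comparison on synthetic Likert-scale means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pre = rng.normal(3.8, 0.6, 103).clip(1, 5)            # 103 respondents
post = (pre + rng.normal(0.7, 0.4, 103)).clip(1, 5)   # improvement after training

t, p = stats.ttest_rel(post, pre)
print(f"F = {t**2:.2f}, P = {p:.4f}, "
      f"mean pre {pre.mean():.2f}, mean post {post.mean():.2f}")
```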

  19. Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.

    Science.gov (United States)

    Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L

    2016-10-01

    The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. Implications for Rehabilitation Adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research. Patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.

  20. Computer Vision Syndrome among Call Center Employees at Telecommunication Company in Bandung

    Directory of Open Access Journals (Sweden)

    Ghea Nursyifa

    2016-06-01

    Background: The occurrence of Computer Vision Syndrome (CVS) at the workplace has increased over recent decades due to the prolonged use of computers. Knowledge of CVS is necessary in order to develop an awareness of how to prevent and alleviate its prevalence. The objective of this study was to assess the knowledge of CVS among call center employees and to explore the CVS symptom most frequently experienced by the workers. Methods: A descriptive cross-sectional study was conducted during the period of September to November 2014 at a telecommunication company in Bandung, using a questionnaire consisting of 30 questions. Of the 30 questions/statements, 15 statements were about knowledge of CVS and the other 15 questions were about the occurrence of CVS and its symptoms. In this study, 125 call center employees participated as respondents, selected by consecutive sampling. The level of knowledge was divided into 3 categories: good (76–100%), fair (56–75%), and poor (<56%). The collected data were presented in frequency tabulations. Results: 74.4% of the respondents had poor knowledge of CVS. The symptom most frequently experienced by the respondents was asthenopia. Conclusions: CVS occurs in call center employees with various symptoms and signs. This situation is not supported by good knowledge of the syndrome, which can hamper prevention programs.

  1. Learning Resources Centers and Their Effectiveness on Students’ Learning Outcomes: A Case-Study of an Omani Higher Education Institute

    Directory of Open Access Journals (Sweden)

    Peyman Nouraey

    2017-06-01

    The study aimed at investigating the use and effectiveness of a learning resources center, which is generally known as a library. In doing so, eight elements were investigated through an author-designed questionnaire, each delving into certain aspects of the afore-mentioned center: (a) students' visit frequency, (b) availability of books related to modules, (c) center facilities, (d) use of discussion rooms, (e) use of online resources, (f) staff cooperation, (g) impact on knowledge enhancement, and (h) recommendation to peers. Eighty undergraduate students participated in the study. Participants were asked to read the statements carefully and choose one of the five responses provided, ranging from strongly agree to strongly disagree. Data were analyzed on a 5-point Likert scale. The findings of the study revealed that participants were mostly in agreement with all eight statements provided in the questionnaire, which was interpreted as positive feedback from the students. The frequencies of responses by the participants were then reported. Finally, the results were compared and contrasted, and related discussions on the effectiveness of libraries and learning resources centers on students' learning performances and outcomes were presented.

  2. Modeling Remote I/O versus Staging Tradeoff in Multi-Data Center Computing

    International Nuclear Information System (INIS)

    Suslu, Ibrahim H

    2014-01-01

    In multi-data center computing, the data to be processed is not always local to the computation. This is a major challenge especially for data-intensive Cloud computing applications, since large amounts of data would need to be either moved to the local site (staging) or accessed remotely over the network (remote I/O). Cloud application developers generally choose between staging and remote I/O intuitively, without making any scientific comparison specific to their application's data access patterns, since there is no generic model available that they can use. In this paper, we propose a generic model to help Cloud application developers choose the most appropriate data access mechanism for their specific application workloads. We define the parameters that potentially affect the end-to-end performance of multi-data center Cloud applications which need to access large datasets over the network. To test and validate our models, we implemented a series of synthetic benchmark applications to simulate the most common data access patterns encountered in Cloud applications. We show that our model provides promising results in different settings with different parameters, such as network bandwidth, server and client capabilities, and data access ratio
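    The kind of decision model the abstract argues for can be sketched in a few lines. The cost formulas, parameter names, and numbers below are illustrative assumptions, not the paper's model: staging pays to move the whole dataset once, while remote I/O pays bandwidth only for the accessed fraction plus a per-request latency.

```python
# Toy staging-vs-remote-I/O cost comparison (all parameters invented).

def staging_time(data_gb, bw_gbps):
    # Move the entire dataset to the compute site before processing.
    return data_gb * 8 / bw_gbps

def remote_io_time(data_gb, access_ratio, bw_gbps, rtt_s, req_mb):
    # Transfer only the accessed fraction, paying one RTT per request.
    accessed_gb = data_gb * access_ratio
    n_requests = accessed_gb * 1024 / req_mb
    return accessed_gb * 8 / bw_gbps + n_requests * rtt_s

for ratio in (0.05, 0.3, 0.8):
    s = staging_time(500, 10)                       # 500 GB over 10 Gb/s
    r = remote_io_time(500, ratio, 10, 0.03, 64)    # 30 ms RTT, 64 MB reads
    print(f"access ratio {ratio:.0%}: staging {s:.0f}s, "
          f"remote I/O {r:.0f}s -> {'stage' if s < r else 'remote'}")
```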

  3. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study aims to describe the experience of an oncology center with computed tomography-guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsies of pulmonary lesions performed between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding specimen adequacy and the specific diagnoses was collected and analyzed. Results: Among the 97 lung biopsies, 94 (96.9%) supplied specimens adequate for histological analysis, with 71 (73.2%) cases diagnosed as malignant lesions and 23 (23.7%) diagnosed as benign lesions. Specimens were inadequate for analysis in three cases. A specific diagnosis was obtained in 83 (85.6%) cases, with high rates for both malignant lesions, 63 (88.7%) cases, and benign lesions, 20 (86.7%). As regards complications, a total of 12 cases were observed, as follows: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax, and 2 (2.1%) cases of hemoptysis. Conclusion: Computed tomography-guided core-needle biopsy of lung lesions demonstrated high rates of specimen adequacy and diagnostic specificity, and low rates of complications in the present study. (author)

  4. Final Report: Phase II Nevada Water Resources Data, Modeling, and Visualization (DMV) Center

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, Thomas [Desert Research Institute; Minor, Timothy [Desert Research Institute; Pohll, Gregory [Desert Research Institute

    2013-07-22

    Water is unquestionably a critical resource throughout the United States. In the semi-arid west -- an area stressed by increases in human population and the sprawl of the built environment -- water is the most important limiting resource. Crucially, science must understand the factors that affect the availability and distribution of water. To sustain growing consumptive demand, science needs to translate understanding into reliable and robust predictions of availability under weather conditions that could be average but might be extreme. These predictions are needed to support current and long-term planning. Similar to the role of weather forecasts and climate prediction, water prediction over short and long temporal scales can contribute to resource strategy, governmental policy and municipal infrastructure decisions, which are arguably tied to natural variability and unnatural change in climate. Changes in seasonal and annual temperature, precipitation, snowmelt, and runoff affect the distribution of water over large temporal and spatial scales, which impacts the risk of flooding and groundwater recharge. Anthropogenic influences and impacts increase the complexity and urgency of the challenge. The goal of this project has been to develop a decision support framework of data acquisition, digital modeling, and 3D visualization. This integrated framework consists of tools for compiling, discovering and projecting our understanding of the processes that control the availability and distribution of water. The framework is intended to support the analysis of the complex interactions between processes that affect water supply, from controlled availability to either scarcity or deluge. The developed framework enables DRI to promote excellence in water resource management, particularly within the Lake Tahoe basin. In principle, this framework could be replicated for other watersheds throughout the United States. Phase II of this project builds upon the research conducted during

  5. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)
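    One way to see the counting behind the two-qubit claim (a hedged reconstruction, not the paper's derivation): a direct Jordan-Wigner encoding of the two-site Fermi-Hubbard model uses one qubit per spin-orbital, i.e. four qubits, but the Hamiltonian conserves the spin-up and spin-down particle numbers and is therefore block diagonal; restricting to the half-filled sector leaves a four-dimensional block, which fits on two qubits:

```latex
% Illustrative counting argument for the qubit reduction.
\[
  H \;=\; \bigoplus_{N_\uparrow,\,N_\downarrow} H_{(N_\uparrow,\,N_\downarrow)},
  \qquad
  \dim H_{(1,1)} \;=\; \binom{2}{1}\binom{2}{1} \;=\; 4
  \;\;\Rightarrow\;\;
  n_q \;=\; \lceil \log_2 4 \rceil \;=\; 2 .
\]
```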

  6. Threat and vulnerability analysis and conceptual design of countermeasures for a computer center under construction

    International Nuclear Information System (INIS)

    Rozen, A.; Musacchio, J.M.

    1988-01-01

    This project involved the assessment of a new computer center to be used as the main national data processing facility of a large European bank. The building serves as the principal facility in the country, with all other branches utilizing the data processing center. As such, the building is a crucial target which may attract terrorist attacks. Threat and vulnerability assessments were performed as a basis for defining an overall, fully integrated security system of passive and active countermeasures for the facility. After separately assessing the range of threats and vulnerabilities, a combined matrix of threats and vulnerabilities was used to identify the crucial combinations. A set of architectural-structural passive measures was added to the active components of the security system

  7. David Grant Medical Center energy use baseline and integrated resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Richman, E.E.; Hoshide, R.K.; Dittmer, A.L.

    1993-04-01

    The US Air Mobility Command (AMC) has tasked Pacific Northwest Laboratory (PNL) with supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy resource opportunities (EROs) at the David Grant Medical Center (DGMC). This report describes the methodology used to identify and evaluate the EROs at DGMC, provides a life-cycle cost (LCC) analysis for each ERO, and prioritizes any life-cycle cost-effective EROs based on their net present value (NPV), value index (VI), and savings to investment ratio (SIR or ROI). Analysis results are presented for 17 EROs that involve energy use in the areas of lighting, fan and pump motors, boiler operation, infiltration, electric load peak reduction and cogeneration, electric rate structures, and natural gas supply. Typical current energy consumption is approximately 22,900 MWh of electricity (78,300 MBtu), 87,600 kcf of natural gas (90,300 MBtu), and 8,300 gal of fuel oil (1,200 MBtu). A summary of the savings potential by energy-use category of all independent cost-effective EROs is shown in a table. This table includes the first cost, yearly energy consumption savings, and NPV for each energy-use category. The net dollar savings and NPV values as derived by the life-cycle cost analysis are based on the 1992 federal discount rate of 4.6%. The implementation of all EROs could result in a yearly electricity savings of more than 6,000 MWh or 26% of current yearly electricity consumption. More than 15 MW of billable load (total billed by the utility for a 12-month period) or more than 34% of current billed demand could also be saved. Corresponding natural gas savings would be 1,050 kcf (just over 1% of current consumption). Total yearly net energy cost savings for all options would be greater than $343,340. This value does not include any operations and maintenance (O&M) savings.

  8. David Grant Medical Center energy use baseline and integrated resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Richman, E.E.; Hoshide, R.K.; Dittmer, A.L.

    1993-04-01

    The US Air Mobility Command (AMC) has tasked Pacific Northwest Laboratory (PNL) with supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy resource opportunities (EROs) at the David Grant Medical Center (DGMC). This report describes the methodology used to identify and evaluate the EROs at DGMC, provides a life-cycle cost (LCC) analysis for each ERO, and prioritizes any life-cycle cost-effective EROs based on their net present value (NPV), value index (VI), and savings to investment ratio (SIR or ROI). Analysis results are presented for 17 EROs that involve energy use in the areas of lighting, fan and pump motors, boiler operation, infiltration, electric load peak reduction and cogeneration, electric rate structures, and natural gas supply. Typical current energy consumption is approximately 22,900 MWh of electricity (78,300 MBtu), 87,600 kcf of natural gas (90,300 MBtu), and 8,300 gal of fuel oil (1,200 MBtu). A summary of the savings potential by energy-use category of all independent cost-effective EROs is shown in a table. This table includes the first cost, yearly energy consumption savings, and NPV for each energy-use category. The net dollar savings and NPV values as derived by the life-cycle cost analysis are based on the 1992 federal discount rate of 4.6%. The implementation of all EROs could result in a yearly electricity savings of more than 6,000 MWh or 26% of current yearly electricity consumption. More than 15 MW of billable load (total billed by the utility for a 12-month period) or more than 34% of current billed demand could also be saved. Corresponding natural gas savings would be 1,050 kcf (just over 1% of current consumption). Total yearly net energy cost savings for all options would be greater than $343,340. This value does not include any operations and maintenance (O&M) savings.

  9. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  10. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    Science.gov (United States)

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  11. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  12. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  13. The Hardwood Tree Improvement and Regeneration Center: its strategic plans for sustaining the hardwood resource

    Science.gov (United States)

    Charles H. Michler; Michael J. Bosela; Paula M. Pijut; Keith E. Woeste

    2003-01-01

    A regional center for hardwood tree improvement, genomics, and regeneration research, development and technology transfer will focus on black walnut, black cherry, northern red oak and, in the future, on other fine hardwoods as the effort is expanded. The Hardwood Tree Improvement and Regeneration Center (HTIRC) will use molecular genetics and genomics along with...

  14. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  15. Resources to Support Ethical Practice in Evaluation: An Interview with the Director of the National Center for Research and Professional Ethics

    Science.gov (United States)

    Goodyear, Leslie

    2012-01-01

    Where do evaluators find resources on ethics and ethical practice? This article highlights a relatively new online resource, a centerpiece project of the National Center for Professional and Research Ethics (NCPRE), which brings together information on best practices in ethics in research, academia, and business in an online portal and center. It…

  16. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  17. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop

  18. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  19. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  20. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    The Intelligent Transportation System (ITS) is becoming an important component of the smart city, working toward safer roads, better traffic control, and on-demand services by utilizing and processing the information collected from the sensors of vehicles and roadside infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirements of complex services against the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnection between a vehicle and the Vehicular Cloud (VC) while the vehicle is computing for a service. More importantly, connection faults seriously disturb the normal services of VCC and impact the safety functions of the transportation system. In this paper, a safety resource allocation mechanism against connection faults in VCC is proposed, using a modified workflow with prediction capability. We first propose a probability model for vehicle movement which satisfies the high-dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.
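    An illustrative stand-in for the reliability-maximizing allocation follows. The exponential contact-time model, the names, and the numbers are assumptions, not the paper's PRMA: each vehicle's remaining connection time is modeled from its mobility, and each subtask goes to the feasible vehicle most likely to stay connected for the task's duration.

```python
import math

vehicles = {            # vehicle -> (mean residual contact time in s, free CPUs)
    "v1": (40.0, 2),
    "v2": (120.0, 1),
    "v3": (15.0, 4),
}

def p_connected(vehicle, duration):
    # Probability the vehicle stays connected for `duration` seconds,
    # assuming exponentially distributed residual contact time.
    mean_contact, _ = vehicles[vehicle]
    return math.exp(-duration / mean_contact)

def allocate(subtasks):
    # Longest tasks first: they are the hardest to place reliably.
    plan = {}
    for task, duration in sorted(subtasks.items(), key=lambda kv: -kv[1]):
        feasible = [v for v, (_, cpus) in vehicles.items() if cpus > 0]
        best = max(feasible, key=lambda v: p_connected(v, duration))
        mean_contact, cpus = vehicles[best]
        vehicles[best] = (mean_contact, cpus - 1)   # reserve one CPU
        plan[task] = (best, p_connected(best, duration))
    return plan

print(allocate({"t1": 30.0, "t2": 10.0, "t3": 5.0}))
```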

  1. Impact of Information Technology, Clinical Resource Constraints, and Patient-Centered Practice Characteristics on Quality of Care

    Directory of Open Access Journals (Sweden)

    JongDeuk Baek

    2015-02-01

    Full Text Available Objective: Factors in the practice environment, such as health information technology (IT) infrastructure, availability of other clinical resources, and financial incentives, may influence whether practices are able to successfully implement the patient-centered medical home (PCMH) model and realize its benefits. This study investigates the impacts of those PCMH-related elements on primary care physicians’ perception of quality of care. Methods: A multiple logistic regression model was estimated using the 2004 to 2005 CTS Physician Survey, a national sample of salaried primary care physicians (n = 1733). Results: The patient-centered practice environment and availability of clinical resources increased physicians’ perceived quality of care. Although IT use for clinical information access did enhance physicians’ ability to provide high quality of care, a similar positive impact of IT use was not found for e-prescribing or the exchange of clinical patient information. Lack of resources was negatively associated with physician perception of quality of care. Conclusion: Since health IT is an important foundation of PCMH, patient-centered practices are more likely to have health IT in place to support care delivery. However, despite its potential to enhance delivery of primary care, simply making health IT available does not necessarily translate into physicians’ perceptions that it enhances the quality of care they provide. It is critical for health-care managers and policy makers to ensure that primary care physicians fully recognize and embrace the use of new technology to improve both the quality of care provided and the patient outcomes.
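
    As a minimal illustration of the estimation step only, the sketch below fits a multiple logistic regression with statsmodels; the variable names and the synthetic data frame are hypothetical stand-ins for the CTS survey items, not the study's actual analysis file.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey analysis file: one row per physician,
# a binary indicator of perceived high quality of care, practice covariates.
rng = np.random.default_rng(0)
n = 1733
df = pd.DataFrame({
    "it_clinical_access": rng.integers(0, 2, n),
    "clinical_resources": rng.integers(0, 2, n),
    "pcmh_environment": rng.integers(0, 2, n),
    "lacks_resources": rng.integers(0, 2, n),
})
logit_p = (-0.5 + 0.6 * df.it_clinical_access + 0.8 * df.clinical_resources
           + 0.7 * df.pcmh_environment - 0.9 * df.lacks_resources)
df["high_quality"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("high_quality ~ it_clinical_access + clinical_resources"
                  " + pcmh_environment + lacks_resources", data=df).fit()
print(model.summary())  # positive coefficients: higher odds of perceived high quality
```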

  2. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in their capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in their network resources, and (3) the high network latency to a centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work, we propose an architecture of IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm, consisting of linearized decision-tree rules based on three conditions (service size, completion time, and VM capacity), for managing and delegating user requests in order to balance workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as optimizing big data distribution in fog and cloud computing. Our simulation results show that our proposed approach can efficiently balance workload, improve resource allocation efficiency, optimize big data distribution, and show better performance than other existing methods.
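
    The paper's exact thresholds are not given in the abstract; the sketch below is a hypothetical rendering of such linearized decision rules, with made-up cutoffs, routing a request to fog or cloud on the three stated conditions.

```python
from dataclasses import dataclass

@dataclass
class Request:
    service_size_mb: float      # size of the requested service
    deadline_s: float           # required completion time
    fog_vm_capacity_mb: float   # capacity free on the nearest fog node

def delegate(req: Request) -> str:
    """Linearized decision rules (hypothetical cutoffs) for fog vs. cloud."""
    # Rule 1: small, delay-sensitive services stay at the edge if they fit.
    if req.deadline_s < 1.0 and req.service_size_mb <= req.fog_vm_capacity_mb:
        return "fog"
    # Rule 2: services exceeding fog VM capacity must go to the cloud.
    if req.service_size_mb > req.fog_vm_capacity_mb:
        return "cloud"
    # Rule 3: otherwise prefer fog for tight deadlines, cloud when relaxed.
    return "fog" if req.deadline_s < 5.0 else "cloud"

print(delegate(Request(service_size_mb=50, deadline_s=0.5, fog_vm_capacity_mb=512)))
```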

  3. Report compiled by Research Center for Carbonaceous Resources, Institute for Chemical Reaction Science, Tohoku University; Tohoku Daigaku Hanno Kagaku Kenkyusho tanso shigen hanno kenkyu center hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-04-01

    The Research Center for Carbonaceous Resources was established in April 1991 for the purpose of developing a comprehensive process for converting carbonaceous resources into clean fuels or into materials equipped with advanced functions. In this report, the track record of the center is introduced. Under study in the conversion process research department is the organization of a comprehensive coal conversion process combining solvent extraction, catalytic decomposition, and catalytic gasification, whose goal is to convert coal cleanly at high efficiency. Under study in the conversion catalyst research department are the development of a coal denitrogenation method, development of a low-temperature gasification method using inexpensive catalysts, synthesis of C2 hydrocarbons in a methane/carbon dioxide reaction, etc. Other endeavors under way involve the design and development of new organic materials such as new carbon materials, and a study of the foundation on which such efforts stand, that is, the control of reactions between solids. Furthermore, in the study of interfacial reaction control, the contact gasification of coal, the ion exchange capacity and surface conditions of brown coal, the carbonization of cation-exchanged brown coal, etc., are being investigated. (NEDO)

  4. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly because we are looking for a more flexible licensing model as well as to avoid vendor lock-in. We performed a technology tracking exercise and among many possible solutions we chose to evaluate Grid Engine as an alternative, because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: much of our production software (most notably accounting and monitoring) relies on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too hard to stand. We will provide guidelines in order to understand how difficult this operation can be and how long the change may take.

  5. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the unknown fears of embracing cloud computing in 9-1-1 dispatch centers in the USA, spanning dimensions such as leaders' fear of change and the complexity of the technology. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  6. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    Science.gov (United States)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of allocated resources can thus be elastically adapted to cope with the needs of the CMS experiment and local users. Moreover, direct access/integration of OpenStack resources into the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.

  7. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
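
    Degree of transportation and centering ratio in studies of this kind are conventionally computed from mesial (m) and distal (d) dentin thicknesses measured before (1) and after (2) preparation; the Gambill-style formulas coded below are that conventional approach, assumed here rather than quoted from this paper.

```python
def canal_transportation(m1, m2, d1, d2):
    """Transportation at one root level (mm).

    m1/d1: mesial/distal dentin thickness before preparation;
    m2/d2: the same after preparation. 0 means no transportation;
    the sign indicates the direction of deviation.
    """
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """Centering ability at one root level; 1.0 is perfect centering.

    By convention the smaller wall-thickness change is the numerator.
    (Sketch only: assumes non-negative thickness changes.)
    """
    mesial, distal = m1 - m2, d1 - d2
    if max(mesial, distal) == 0:
        return 1.0  # no removal on either wall: instrument stayed centered
    return min(mesial, distal) / max(mesial, distal)

# Example at 3 mm from the apex (hypothetical values in mm):
print(canal_transportation(1.20, 1.05, 0.90, 0.80))  # 0.05 toward mesial
print(centering_ratio(1.20, 1.05, 0.90, 0.80))       # ~0.67
```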

  8. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Directory of Open Access Journals (Sweden)

    André PAGLIOSA

    2015-01-01

    Full Text Available Abstract: The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape.

  9. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    International Nuclear Information System (INIS)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio

    2015-01-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)

  10. DrugSig: A resource for computational drug repositioning utilizing gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Hongyu Wu

    Full Text Available Computational drug repositioning has proven to be an effective approach to developing new uses for existing drugs. However, existing strategies rely strongly on drug-response gene signatures that are scattered across separate, individual experimental datasets, which leads to inefficient outputs. A comprehensive database of drug-response gene signatures would therefore be very helpful to these methods. We collected drug-response microarray data and annotated the related drug and target information from public databases and the scientific literature. By selecting the top 500 up-regulated and down-regulated genes as drug signatures, we manually established the DrugSig database. Currently DrugSig contains more than 1300 drugs, 7000 microarrays, and 800 targets. Moreover, we developed signature-based and target-based functions to aid drug repositioning. The constructed database can serve as a resource to accelerate computational drug repositioning. Database URL: http://biotechlab.fudan.edu.cn/database/drugsig/.
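
    Signature-based repositioning of this kind typically scores how well a drug's up/down gene sets reverse a query signature; the sketch below is a hypothetical illustration of one simple reversal score, not DrugSig's documented method.

```python
def reversal_score(query_up, query_down, drug_up, drug_down):
    """Crude connectivity-style score in [-1, 1] over gene-symbol sets.

    Positive: the drug reverses the query signature (drug-down genes overlap
    query-up genes and vice versa), the usual repositioning criterion
    against a disease signature.
    """
    n = len(query_up) + len(query_down)
    if n == 0:
        return 0.0
    reversed_hits = len(query_up & drug_down) + len(query_down & drug_up)
    mimicked_hits = len(query_up & drug_up) + len(query_down & drug_down)
    return (reversed_hits - mimicked_hits) / n

# Rank a (toy) drug-signature library against a disease signature:
disease_up, disease_down = {"EGFR", "MYC"}, {"TP53"}
library = {"drugA": ({"TP53"}, {"EGFR", "MYC"}), "drugB": ({"MYC"}, set())}
ranked = sorted(library, key=lambda d: reversal_score(
    disease_up, disease_down, *library[d]), reverse=True)
print(ranked)  # ['drugA', 'drugB']
```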

  11. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
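
    The scheme described here is essentially physical reservoir computing: the soft body's sensor time series serve as a high-dimensional state, and only a linear readout is trained. Below is a minimal sketch under assumed data (random placeholder states stand in for real arm sensor traces; the target is a delayed copy of the input, a standard short-term-memory task), not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed data: sensor readouts from the soft arm, shape (T, n_sensors),
# collected while driving the arm with input u(t).
T, n_sensors, delay = 2000, 20, 5
u = rng.uniform(-1, 1, T)
states = rng.standard_normal((T, n_sensors))  # placeholder for real sensor data
target = np.roll(u, delay)                    # emulate a tau-step memory function

# Train a linear readout by ridge regression on the first half of the data.
X, y = states[delay:T // 2], target[delay:T // 2]
ridge = 1e-3
w = np.linalg.solve(X.T @ X + ridge * np.eye(n_sensors), X.T @ y)

# Evaluate on held-out data; high correlation would imply the body dynamics
# retain enough short-term memory to reconstruct the delayed input.
Xte, yte = states[T // 2:], target[T // 2:]
pred = Xte @ w
print(np.corrcoef(pred, yte)[0, 1])
```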

  12. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available Purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization’s technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture, and feelings of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts' administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts' administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  13. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available Purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization’s technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture, and feelings of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts' administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts' administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  14. Pain, Work-related Characteristics, and Psychosocial Factors among Computer Workers at a University Center.

    Science.gov (United States)

    Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia

    2014-04-01

    [Purpose] Complaints of pain are common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) underwent measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the no-rest-breaks group, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. Associations were also observed between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, high head protrusion, and shoulder angle while using the computer mouse.

  15. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the resulting CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R2 = 0.99), with close agreement between the two computation methods. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
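
    CPEI is conventionally derived from the center-of-pressure trajectory: a reference line is drawn from the first to the last CoP point, the lateral deviation of the trajectory is taken over the anterior third of the foot, and it is normalized by foot width at that level. The sketch below encodes that conventional definition as an assumption; the paper's exact rules (and the exceptions its algorithm handles) are not in the abstract.

```python
import numpy as np

def cpei(cop_xy: np.ndarray, foot_width_anterior: float) -> float:
    """Center of Pressure Excursion Index (%), conventional definition.

    cop_xy: (N, 2) CoP trajectory, heel-first, in foot coordinates
            (x = medio-lateral, y = posterior -> anterior).
    foot_width_anterior: foot width at the anterior third (same units).
    """
    p0, p1 = cop_xy[0], cop_xy[-1]
    line = (p1 - p0) / np.linalg.norm(p1 - p0)
    # Signed perpendicular distance of each CoP sample from the reference line.
    rel = cop_xy - p0
    dev = rel[:, 0] * line[1] - rel[:, 1] * line[0]
    # Restrict to the anterior third of the anterior-posterior span.
    y = cop_xy[:, 1]
    anterior = y >= y.min() + 2.0 * (y.max() - y.min()) / 3.0
    excursion = np.max(np.abs(dev[anterior]))
    return 100.0 * excursion / foot_width_anterior
```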

  16. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  17. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1 km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)
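
    Turning mean wind speeds into extractable-power maps requires an assumed speed distribution; a common shortcut (assumed here for illustration, not necessarily the NOABL methodology) is the Rayleigh distribution, for which the mean of v³ is (6/π) times the cube of the mean speed.

```python
import math

RHO_AIR = 1.225  # kg/m^3, sea-level standard air density

def rayleigh_power_density(mean_speed):
    """Mean wind power density (W/m^2) for a Rayleigh-distributed wind.

    For a Rayleigh distribution, E[v^3] = (6/pi) * mean_speed**3, so the
    mean of 0.5*rho*v^3 exceeds 0.5*rho*mean_speed**3 by a factor of ~1.91.
    """
    return 0.5 * RHO_AIR * (6.0 / math.pi) * mean_speed ** 3

# 10 m annual mean speeds in the range quoted by the study:
for v in (4.5, 6.0, 7.0, 10.0):
    print(f"{v:4.1f} m/s -> {rayleigh_power_density(v):7.1f} W/m^2")
```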

  18. Fox Chase Cancer Center's Genitourinary Division: a national resource for research, innovation and patient care.

    Science.gov (United States)

    Uzzo, Robert G; Horwitz, Eric M; Plimack, Elizabeth R

    2016-04-01

    Founded in 1904, Fox Chase Cancer Center remains committed to its mission. It is one of 41 centers in the country designated as a Comprehensive Cancer Center by the National Cancer Institute, is a founding member of the National Comprehensive Cancer Network, holds the magnet designation for nursing excellence, is one of the first to establish a family cancer risk assessment program, and has achieved national distinction because of the scientific discoveries made there that have advanced clinical care. Two of its researchers have won Nobel prizes. The Genitourinary Division is nationally recognized and viewed as one of the top driving forces behind the growth of Fox Chase due to its commitment to initiating and participating in clinical trials, its prolific contributions to peer-reviewed publications and presentations at scientific meetings, its innovations in therapies and treatment strategies, and its commitment to bringing cutting-edge therapies to patients.

  19. Assessment of Outreach by a Regional Burn Center: Could Referral Criteria Revision Help with Utilization of Resources?

    Science.gov (United States)

    Carter, Nicholas H; Leonard, Clint; Rae, Lisa

    2018-02-20

    The objectives of this study were to identify trends in preburn center care, assess needs for outreach and education efforts, and evaluate resource utilization with regard to referral criteria. We hypothesized that many transferred patients were discharged home after brief hospitalizations and without need for operation. Retrospective chart review was performed for all adult and pediatric transfers to our regional burn center from July 2012 to July 2014. Details of initial management including TBSA estimation, fluid resuscitation, and intubation status were recorded. Mode of transport, burn center length of stay, need for operation, and in-hospital mortality were analyzed. In two years, our burn center received 1004 referrals from other hospitals including 713 inpatient transfers. Within this group, 621 were included in the study. Among transferred patients, 476 (77%) had burns less than 10% TBSA, 69 (11%) had burns between 10-20% TBSA, and 76 (12%) had burns greater than 20% TBSA. Referring providers did not document TBSA for 261 (42%) of patients. Among patients with less than 10% TBSA burns, 196 (41%) received fluid boluses. Among patients with TBSA < 10%, 196 (41%) were sent home from the emergency department or discharged within 24 hours, and an additional 144 (30%) were discharged within 48 hours. Overall, 187 (30%) patients required an operation. In-hospital mortality rates were 1.5% for patients who arrived by ground transport, 14.9% for rotor wing transport, and 18.2% for fixed wing transport. Future education efforts should emphasize the importance of calculating TBSA to guide need for fluid resuscitation and restricting fluid boluses to patients that are hypotensive. Clarifying the American Burn Association burn center referral criteria to distinguish between immediate transfer vs outpatient referral may improve patient care and resource utilization.

  20. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    Science.gov (United States)

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    Science.gov (United States)

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, deployment and maintenance costs. Then, a novel resource service model for the industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources including operating system (OS), programs and data on the server-side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer and interface and management layer. We also present a detailed analysis on the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experiment prototype system.

  2. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  3. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  4. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at centre of mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  5. Monitoring of computing resource use of active software releases at ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219183; The ATLAS collaboration

    2017-01-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has demand for the computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and dis...

  6. 76 FR 6627 - National Center for Research Resources; Notice of Closed Meeting

    Science.gov (United States)

    2011-02-07

    ... U.S.C., as amended. The contract proposals and the discussions could disclose confidential trade... concerning individuals associated with the contract proposals, the disclosure of which would constitute a... Resources Special Emphasis Panel; SBIR Contract Review. Date: March 16, 2011. Time: 8 a.m. to 5 p.m. Agenda...

  7. Amarillo National Resource Center for Plutonium. Quarterly technical progress report, May 1--July 31, 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-09-01

    Progress is reported on research projects related to the following: Electronic resource library; Environment, safety, and health; Communication, education, training, and community involvement; Nuclear and other materials; and Reporting, evaluation, monitoring, and administration. Technical studies investigate remedial action of high explosives-contaminated lands, radioactive waste management, nondestructive assay methods, and plutonium processing, handling, and storage.

  8. The culture collection and herbarium of the Center for Forest Mycology Research: A national resource

    Science.gov (United States)

    J.A. Glaeser; K.K. Nakasone; D.J. Lodge; B. Ortiz-Santana; D.L. Lindner

    2013-01-01

    The Center for Forest Mycology Research (CFMR), U.S. Forest Service, Northern Research Station, Madison, WI, is home to the world's largest collection of wood-inhabiting fungi. These collections constitute a library of the fungal kingdom that is used by researchers throughout the world. The CFMR collections have many practical uses that have improved the lives of...

  9. Department of Music honors women's month with benefit concert for the Women's Resource Center

    OpenAIRE

    Adams, Louise

    2008-01-01

    The Virginia Tech Department of Music presents guest pianist Lise Keiter-Brotzman in "A Tribute to Women Composers" on Wednesday, March 19 at 8 p.m., in the Squires Recital Salon located in the Squires Student Center on College Avenue adjacent to downtown Blacksburg.

  10. Family Literacy Project. Learning Centers for Parents and Children. A Resource Guide.

    Science.gov (United States)

    Crocker, M. Judith, Ed.; And Others

    This guide is intended to help adult education programs establish family literacy programs and create Family Learning Centers in Cleveland Public Schools. The information should assist program coordinators in developing educational components that offer activities to raise the self-esteem of the parents and provide them with the knowledge and…

  11. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  12. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC.

  13. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
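
    The recursive construction and constant-time block sums the paper builds on can be stated compactly; the sketch below shows the standard serial recurrence and the four-corner rectangle sum (plain NumPy for illustration, not the paper's row-parallel hardware design).

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    """ii(x, y) = sum of img over all pixels at or above-left of (x, y).

    Serial recurrence (the equations whose decomposition the paper
    parallelizes):
        ii[y, x] = img[y, x] + ii[y-1, x] + ii[y, x-1] - ii[y-1, x-1]
    """
    ii = np.zeros_like(img, dtype=np.int64)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            ii[y, x] = (img[y, x]
                        + (ii[y - 1, x] if y else 0)
                        + (ii[y, x - 1] if x else 0)
                        - (ii[y - 1, x - 1] if x and y else 0))
    return ii

def rect_sum(ii: np.ndarray, top: int, left: int, bottom: int, right: int) -> int:
    """Sum over an inclusive rectangle in O(1) via four corner lookups."""
    total = ii[bottom, right]
    if top:
        total -= ii[top - 1, right]
    if left:
        total -= ii[bottom, left - 1]
    if top and left:
        total += ii[top - 1, left - 1]
    return int(total)

img = np.arange(16).reshape(4, 4)
ii = integral_image(img)
assert rect_sum(ii, 1, 1, 3, 2) == img[1:4, 1:3].sum()
```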

  14. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

    Full Text Available The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  15. Geological characteristics and resource potentials of oil shale in Ordos Basin, Center China

    Energy Technology Data Exchange (ETDEWEB)

    Yunlai, Bai; Yingcheng, Zhao; Long, Ma; Wu-jun, Wu; Yu-hu, Ma

    2010-09-15

    It has been shown that the Ordos basin contains not only abundant oil, gas, coal, coal-bed gas, groundwater, and giant uranium deposits, but also abundant oil shale resources. The thickness of the oil shale is usually 4-36 m, its oil content 1.5%-13.7%, and its calorific value 1.66-20.98 MJ/kg. The resource amount of oil shale with burial depth less than 2000 m is over 2000x10^8 t (334). Within it, the confirmed reserve is about 1x10^8 t (121). Not only huge economic benefits but also precious experience in developing oil shale may be obtained in the Ordos basin.

  16. Center for Fetal Monkey Gene Transfer for Heart, Lung, and Blood Diseases: An NHLBI Resource for the Gene Therapy Community

    Science.gov (United States)

    Skarlatos, Sonia I.

    2012-01-01

    The goals of the National Heart, Lung, and Blood Institute (NHLBI) Center for Fetal Monkey Gene Transfer for Heart, Lung, and Blood Diseases are to conduct gene transfer studies in monkeys to evaluate safety and efficiency, and to provide NHLBI-supported investigators with expertise, resources, and services to actively pursue gene transfer approaches in monkeys in their research programs. NHLBI-supported projects span investigators throughout the United States and have addressed novel approaches to gene delivery, provided “proof-of-principle”, assessed whether findings in small-animal models could be demonstrated in a primate species, or were conducted to enable new grant or IND submissions. The Center for Fetal Monkey Gene Transfer for Heart, Lung, and Blood Diseases successfully aids the gene therapy community in addressing regulatory barriers, and serves as an effective vehicle for advancing the field. PMID:22974119

  17. CoryneCenter – An online resource for the integrated analysis of corynebacterial genome and transcriptome data

    Directory of Open Access Journals (Sweden)

    Hüser Andrea T

    2007-11-01

    Full Text Available Abstract. Background: The introduction of high-throughput genome sequencing and post-genome analysis technologies, e.g. DNA microarray approaches, has created the potential to unravel and scrutinize complex gene-regulatory networks on a large scale. The discovery of transcriptional regulatory interactions has become a major topic in modern functional genomics. Results: To facilitate the analysis of gene-regulatory networks, we have developed CoryneCenter, a web-based resource for the systematic integration and analysis of genome, transcriptome, and gene regulatory information for prokaryotes, especially corynebacteria. For this purpose, we extended and combined the following systems into a common platform: (1) GenDB, an open source genome annotation system, (2) EMMA, a MAGE compliant application for high-throughput transcriptome data storage and analysis, and (3) CoryneRegNet, an ontology-based data warehouse designed to facilitate the reconstruction and analysis of gene regulatory interactions. We demonstrate the potential of CoryneCenter by means of an application example. Using microarray hybridization data, we compare the gene expression of Corynebacterium glutamicum under acetate and glucose feeding conditions: known regulatory networks are confirmed, but moreover CoryneCenter points out additional regulatory interactions. Conclusion: CoryneCenter provides more than the sum of its parts. Its novel analysis and visualization features significantly simplify the process of obtaining new biological insights into complex regulatory systems. Although the platform currently focusses on corynebacteria, the integrated tools are by no means restricted to these species, and the presented approach offers a general strategy for the analysis and verification of gene regulatory networks. CoryneCenter provides freely accessible projects with the underlying genome annotation, gene expression, and gene regulation data. The system is publicly available at http://www.CoryneCenter.de.

  18. The Sharjah Center for Astronomy and Space Sciences (SCASS 2015): Concept and Resources

    Science.gov (United States)

    Naimiy, Hamid M. K. Al

    2015-08-01

    The Sharjah Center for Astronomy and Space Sciences (SCASS) was launched in 2015 at the University of Sharjah in the UAE. The center will serve to enrich research in the fields of astronomy and space sciences, promote these fields at all educational levels, and encourage community involvement in these sciences. SCASS consists of: The Planetarium, which contains a semi-circular display screen (18 meters in diameter) installed at an angle of 10° that displays high-definition images using an advanced digital display system consisting of seven (7) high-performance light-display channels. The Planetarium Theatre offers a 200-seat capacity with seats placed at carefully calculated angles, and also contains an enormous star display (Star Ball - 10 million stars) located in the heart of the celestial dome theatre. The Sharjah Astronomy Observatory: a small optical observatory consisting of a 45-centimeter reflector telescope for observing galaxies, stars and planets, connected to a 20-centimeter refractor telescope for observing the sun and moon, equipped with highly developed astronomical devices including a digital camera (CCD) and a high-resolution Echelle spectrograph with autoguiding and remote calibration ports. Educational displays on astronomy, space and physics for various age groups include: an advanced space display that allows viewing the universe during four (4) different time periods as seen by (1) the naked eye; (2) Galileo; (3) spectrographic technology; and (4) the space technology of today; and a space technology display covering space discoveries from the launching of the first satellite in the 1950s until now. The design concept for the center (450,000 sq. meters) was originated by HH Sheikh Sultan bin Mohammed Al Qasimi, Ruler of Sharjah, and depicts the dome as representing the sun in the middle of the center, surrounded by planetary bodies in orbit to form the solar system as seen in the sky.

  19. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  20. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

    The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate...... computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, applying computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating...... the learning game “Global Conflicts: Latin America” as a resource into the teaching and learning of a course involving the two subjects “English language learning” and “Social studies” in the final year of a Danish high school. The study adopts an explorative research design approach and investigates...

  1. A hypothesis on the formation of the primary ossification centers in the membranous neurocranium: a mathematical and computational model.

    Science.gov (United States)

    Garzón-Alvarado, Diego A

    2013-01-21

    This article develops a model of the appearance and location of the primary centers of ossification in the calvaria. The model uses a system of reaction-diffusion equations of two molecules (BMP and Noggin) whose behavior is of the activator-substrate type and whose solution produces Turing patterns, which represent the primary ossification centers. Additionally, the model includes the level of cell maturation as a function of the location of mesenchymal cells, so that mature cells can become osteoblasts due to the action of BMP2. With this model, we can have two frontal primary centers, two parietal centers, and one, two or more occipital centers. The location of these centers in the simplified computational model is highly consistent with those centers found at the embryonic level. Copyright © 2012 Elsevier Ltd. All rights reserved.
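
    The activator-substrate mechanism described above is straightforward to illustrate numerically. The sketch below is a minimal stand-in assuming a generic Schnakenberg-type activator-substrate system (not the paper's exact BMP/Noggin kinetics) with commonly used parameter values; it evolves a 1-D reaction-diffusion pair until Turing peaks emerge, the activator peaks playing the role of candidate ossification centers.

        import numpy as np

        # Illustrative 1-D activator-substrate (Schnakenberg-type) system; the
        # spatial peaks of the activator stand in for ossification centers.
        a, b = 0.1, 0.9          # kinetic parameters (assumed, commonly used values)
        Du, Dv = 1.0, 20.0       # diffusion coefficients; Dv >> Du enables instability
        n, dx, dt, steps = 200, 1.0, 0.01, 100_000

        rng = np.random.default_rng(0)
        u = (a + b) + 0.01 * rng.standard_normal(n)      # perturbed steady state
        v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(n)

        def lap(f):
            # periodic 1-D Laplacian
            return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

        for _ in range(steps):
            u, v = (u + dt * (Du * lap(u) + a - u + u**2 * v),
                    v + dt * (Dv * lap(v) + b - u**2 * v))

        peaks = np.sum((u > np.roll(u, 1)) & (u > np.roll(u, -1)) & (u > u.mean()))
        print(f"activator peaks (candidate centers): {peaks}")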

  2. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnect with elastic optical networking is a promising scenario for meeting the high-burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks to accommodate data center services. This study extends that work to consider resource integration that breaks through the boundaries of individual network devices, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in a software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. The MSRI can accommodate data center services through resource integration when a single function or resource is too scarce to provision the services, and it enables globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, and compared with other provisioning schemes.
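
    The headline metric above, path blocking probability, can be illustrated with the classical Erlang-B formula for a pool of identical channels. The sketch below is a generic illustration of that metric, not the paper's RIM provisioning scheme, and the channel count and offered load are invented figures.

        def erlang_b(servers: int, offered_load: float) -> float:
            """Blocking probability of an M/M/c/c system via the stable recursion."""
            b = 1.0
            for c in range(1, servers + 1):
                b = offered_load * b / (c + offered_load * b)
            return b

        # e.g. 40 wavelength channels offered 30 Erlangs of lightpath requests
        print(f"blocking probability: {erlang_b(40, 30.0):.4f}")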

  3. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  4. Adaptive resource allocation scheme using sliding window subchannel gain computation: context of OFDMA wireless mobiles systems

    International Nuclear Information System (INIS)

    Khelifa, F.; Samet, A.; Ben Hassen, W.; Afif, M.

    2011-01-01

    Multiuser diversity combined with Orthogonal Frequency Division Multiple Access (OFDMA) is a promising technique for achieving high downlink capacities in new generations of cellular and wireless network systems. The total capacity of an OFDMA-based system is maximized when each subchannel is assigned to the mobile station with the best channel-to-noise ratio for that subchannel, with power distributed uniformly among all subchannels. A contiguous method of subchannel construction is adopted in the IEEE 802.16m standard in order to reduce OFDMA system complexity. In this context, a new subchannel gain computation method can contribute, jointly with optimal subchannel assignment, to maximizing total system capacity. In this paper, two new methods are proposed in order to achieve a better trade-off between fairness and efficient use of resources. Numerical results show that the proposed algorithms provide low complexity, higher total system capacity and better fairness among users compared to other recent methods.
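
    As an illustration of the windowed-gain idea, the sketch below (synthetic data; not the paper's two proposed methods) smooths each user's per-subchannel channel-to-noise ratios with a sliding window, reflecting contiguous subchannel construction, and then assigns every subchannel to its best user under uniform power.

        import numpy as np

        rng = np.random.default_rng(1)
        n_users, n_subch, win = 4, 16, 3

        # Rayleigh-faded per-subchannel channel-to-noise ratios (toy data)
        cnr = rng.exponential(scale=1.0, size=(n_users, n_subch))

        # Sliding-window gain: average each subchannel's CNR with its neighbours.
        kernel = np.ones(win) / win
        windowed = np.vstack([np.convolve(row, kernel, mode="same") for row in cnr])

        # Greedy capacity-oriented assignment: each subchannel to its best user.
        assign = windowed.argmax(axis=0)
        rate = np.log2(1.0 + cnr[assign, np.arange(n_subch)])  # bit/s/Hz per subchannel
        print("subchannel -> user:", assign)
        print(f"total capacity: {rate.sum():.2f} bit/s/Hz")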

  5. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.
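
    For a single qubit, robustness of magic reduces to a small linear program over the six stabilizer states (the Bloch vectors on the Pauli axes): minimize the total absolute weight of an affine decomposition of the target state. The sketch below is an illustrative reduction, not the authors' implementation; for the T state the optimum should come out near 1.7321 (the square root of 3).

        import numpy as np
        from scipy.optimize import linprog

        # Six single-qubit stabilizer states as Bloch vectors (+/- x, y, z).
        stab = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                         [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)

        def robustness(bloch):
            """min sum|x| s.t. sum_i x_i stab_i = bloch and sum_i x_i = 1."""
            A = np.vstack([stab.T, np.ones(6)])    # 3 Bloch rows + normalisation
            b = np.append(bloch, 1.0)
            # Split x = p - q with p, q >= 0 so the objective sum(p + q) is linear.
            res = linprog(c=np.ones(12), A_eq=np.hstack([A, -A]), b_eq=b,
                          bounds=(0, None))
            return res.fun

        t_state = np.ones(3) / np.sqrt(3)          # Bloch vector of the T state
        print(f"R(T) = {robustness(t_state):.4f}")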

  6. Amarillo National Resource Center for plutonium. Work plan progress report, November 1, 1995--January 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Cluff, D. [Texas Tech Univ., Lubbock, TX (United States)]

    1996-04-01

    The Center operates under a cooperative agreement between DOE and the State of Texas and is directed and administered by an education consortium. Its programs include developing peaceful uses for the materials removed from dismantled weapons, studying the effects of nuclear materials on the environment and public health, remedying contaminated soils and water, studying the storage, disposition, and transport of Pu, HE, and other hazardous materials removed from weapons, providing research and counsel to the US in carrying out weapons reductions in cooperation with Russia, and conducting a variety of education and training programs.

  7. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society for Therapeutic Radiology and Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculating staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs in Switzerland.
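
    The capacity arithmetic behind such projections is easy to sketch. In the toy calculation below the 2020 patient count is taken from the abstract, while the per-unit and per-staff workload benchmarks are placeholder assumptions, not the QUARTS/IAEA figures the study actually used.

        # Illustrative QUARTS-style capacity arithmetic (benchmarks are assumed).
        patients_rt_2020 = 34_041       # patients requiring radiotherapy (abstract)
        courses_per_trt_year = 450      # assumed annual throughput per TRT unit
        patients_per_ro = 200           # assumed workload per radiation oncologist
        patients_per_mp = 450           # assumed workload per medical physicist
        patients_per_rtt = 110          # assumed workload per technologist

        need = {
            "TRT units": patients_rt_2020 / courses_per_trt_year,
            "ROs": patients_rt_2020 / patients_per_ro,
            "MPs": patients_rt_2020 / patients_per_mp,
            "RTTs": patients_rt_2020 / patients_per_rtt,
        }
        for role, n in need.items():
            print(f"{role:9s} required by 2020: {n:6.1f}")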

  8. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society for Therapeutic Radiology and Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculating staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs in Switzerland. (orig.)

  9. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure that gives scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high-throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. In the context of their flagship projects, EGI-Engage and EUDAT2020, EGI and EUDAT began a collaboration in March 2015 to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thereby pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, from the beginning, this activity has been driven by the end users. The identified user communities are

  10. Establishment of the South-Eastern Norway Regional Health Authority Resource Center for Children with Prenatal Alcohol/Drug Exposure

    Directory of Open Access Journals (Sweden)

    Gro C. C. Løhaugen

    2015-01-01

    This paper presents a new initiative in the South-Eastern Health Region of Norway to establish a regional resource center focusing on services for children and adolescents aged 2–18 years with prenatal exposure to alcohol or other drugs. In Norway, the prevalence of fetal alcohol syndrome (FAS) is not known but has been estimated at between 1 and 2 children per 1000 births, while the prevalence of prenatal exposure to illicit drugs is unknown. The resource center is the first of its kind in Scandinavia and will have three main objectives: (1) provide hospital staff, community health and child welfare personnel, and special educators with information, educational courses, and seminars focused on the identification, diagnosis, and treatment of children with a history of prenatal alcohol/drug exposure; (2) provide specialized health services, such as diagnostic services and intervention planning, for children referred from hospitals in the South-Eastern Health Region of Norway; and (3) initiate multicenter studies focusing on the diagnostic process and evaluation of interventions.

  11. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    Science.gov (United States)

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism and holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bids for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used, and the proposed model is simulated in CloudSim; the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service level agreement violations, and provides the most utility to the provider.
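
    The repeated-bidding dynamic can be sketched with a toy sealed-bid auction in which each user raises its bid only while it is outbid and the new bid stays below its private valuation; the process stops when no player wants to deviate, a Nash-like stopping point near the second-highest valuation. This is a generic illustration, not the paper's utility functions or pricing model.

        # Toy repeated sealed-bid auction for one cloud resource.
        valuations = {"u1": 10.0, "u2": 8.0, "u3": 6.0}   # private values (assumed)
        bids = {u: 1.0 for u in valuations}               # opening bids
        step = 0.25                                       # bid increment per round

        for _ in range(200):                              # repeated rounds
            winner = max(bids, key=bids.get)
            moved = False
            for u, v in valuations.items():
                if u != winner and bids[u] + step <= v:
                    bids[u] = min(v, bids[winner] + step) # best response: just outbid
                    moved = True
            if not moved:                                 # no profitable deviation
                break

        winner = max(bids, key=bids.get)
        print(f"winner: {winner}, price: {bids[winner]:.2f}")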

  12. Computer modelling of the UK wind energy resource: UK wind speed data package and user manual

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    A software package has been developed for IBM-PC or true compatibles. It is designed to provide easy access to the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. With the wind speed software package, the user is able to obtain a display of the modelled wind speed at 10m, 25m and 45m above ground level for any location in the UK. The required co-ordinates are simply supplied by the user, and the package displays the selected wind speed. This user manual summarises the methodology used in the generation of these UK maps and shows computer generated plots of the 25m wind speeds in 200 x 200 km regions covering the whole UK. The uncertainties inherent in the derivation of these maps are also described, and notes given on their practical usage. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. (18 figures, 3 tables, 6 references). (author)
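
    Looking up the modelled speed at a user-supplied location amounts to interpolating the 1 km grid. A minimal sketch, assuming an invented toy grid in place of the NOABL output and plain bilinear interpolation for points strictly inside the grid:

        import numpy as np

        # Toy 1 km gridded mean wind-speed map (m/s), standing in for NOABL output.
        rng = np.random.default_rng(2)
        grid = 5.0 + 2.0 * rng.random((200, 200))     # e.g. speeds at 25 m a.g.l.

        def wind_at(easting_km: float, northing_km: float) -> float:
            """Bilinearly interpolate the gridded speed at an interior point."""
            x0, y0 = int(easting_km), int(northing_km)
            fx, fy = easting_km - x0, northing_km - y0
            z = grid[y0:y0 + 2, x0:x0 + 2]            # 2x2 neighbourhood
            return ((1 - fx) * (1 - fy) * z[0, 0] + fx * (1 - fy) * z[0, 1]
                    + (1 - fx) * fy * z[1, 0] + fx * fy * z[1, 1])

        print(f"speed at (73.4, 121.9) km: {wind_at(73.4, 121.9):.2f} m/s")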

  13. Monitoring of computing resource use of active software releases at ATLAS

    Science.gov (United States)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has demand for the computing resources needed for event reconstruction. We report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows from Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocess mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processes in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC, and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.
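
    Per-job accounting of the kind described here can be approximated at small scale with standard-library instrumentation. The sketch below is not the ATLAS PerfMon or MemoryMonitor tooling; it simply reports wall time, CPU time and peak resident set size for a function on Unix-like systems.

        import resource
        import time

        def report_usage(label, fn, *args, **kwargs):
            """Run fn and report wall time, CPU time and peak RSS (Unix stdlib)."""
            t0, c0 = time.perf_counter(), time.process_time()
            out = fn(*args, **kwargs)
            # ru_maxrss is kilobytes on Linux (bytes on macOS).
            peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
            print(f"{label}: wall={time.perf_counter() - t0:.2f}s "
                  f"cpu={time.process_time() - c0:.2f}s peak_rss={peak}")
            return out

        report_usage("toy workload", lambda: sum(i * i for i in range(10**7)))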

  14. U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center-fiscal year 2010 annual report

    Science.gov (United States)

    Nelson, Janice S.

    2011-01-01

    The Earth Resources Observation and Science (EROS) Center is a U.S. Geological Survey (USGS) facility focused on providing science and imagery to better understand our Earth. The work of the Center is shaped by the earth sciences and the missions of our stakeholders, and is implemented through strong program and project management and the application of state-of-the-art information technologies. Fundamentally, EROS contributes to the understanding of a changing Earth through 'research to operations' activities that include developing, implementing, and operating remote-sensing-based terrestrial monitoring capabilities needed to address interdisciplinary science and applications objectives at all levels, both nationally and internationally. The Center's programs and projects continually strive to meet, and where possible exceed, the changing needs of the USGS, the Department of the Interior, our Nation, and international constituents. The Center's multidisciplinary staff uses their unique expertise in remote sensing science and technologies to conduct basic and applied research, data acquisition, systems engineering, information access and management, and archive preservation to address the Nation's most critical needs. Of particular note is the role of EROS as the primary provider of Landsat data, the longest comprehensive global land Earth observation record ever collected. This report is intended to provide an overview of the scientific and engineering achievements and illustrate the range and scope of the activities and accomplishments at EROS throughout fiscal year (FY) 2010. Additional information concerning the scientific, engineering, and operational achievements can be obtained from the scientific papers and other documents published by EROS staff or by visiting our web site at http://eros.usgs.gov. We welcome comments and follow-up questions on any aspect of this Annual Report and invite any of our customers or partners to contact us at their convenience. To

  15. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and timing of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs, in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, or imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely
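
    The classification pipeline named above, common spatial patterns followed by Fisher's linear discriminant analysis, condenses into a short numpy sketch. The epochs below are synthetic two-class data, and the code is a generic CSP/LDA illustration rather than the study's calibration software.

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(3)
        n_ch, n_t, n_trials = 8, 256, 40

        def trials(scale):
            # synthetic EEG epochs: class-dependent variance on two channels
            x = rng.standard_normal((n_trials, n_ch, n_t))
            x[:, :2, :] *= scale
            return x

        xa, xb = trials(3.0), trials(1.0)              # two mental-task classes

        def mean_cov(x):
            return np.mean([e @ e.T / np.trace(e @ e.T) for e in x], axis=0)

        # CSP: generalised eigendecomposition of the class covariances.
        ca, cb = mean_cov(xa), mean_cov(xb)
        _, w = eigh(ca, ca + cb)                       # eigenvalue-sorted columns
        w = w[:, [0, -1]]                              # two most discriminative filters

        def feats(x):
            p = np.einsum("ck,nkt->nct", w.T, x)       # spatially filtered epochs
            return np.log(p.var(axis=2))               # log-variance features

        fa, fb = feats(xa), feats(xb)
        # Fisher LDA on the two-dimensional features.
        sw = np.cov(fa.T) + np.cov(fb.T)
        wl = np.linalg.solve(sw, fa.mean(0) - fb.mean(0))
        thr = wl @ (fa.mean(0) + fb.mean(0)) / 2
        acc = ((fa @ wl > thr).mean() + (fb @ wl < thr).mean()) / 2
        print(f"training accuracy: {acc:.2f}")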

  16. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
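
    The compartmental models that SAAM fits are systems of linear ordinary differential equations. A minimal sketch, assuming a generic two-compartment (plasma-tissue) exchange model with invented rate constants rather than any SAAM-fitted values:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Two-compartment model: dq1/dt = -(k01 + k21) q1 + k12 q2
        #                        dq2/dt = k21 q1 - k12 q2
        k01, k21, k12 = 0.1, 0.3, 0.2        # 1/h rate constants (assumed)

        def rhs(t, q):
            q1, q2 = q
            return [-(k01 + k21) * q1 + k12 * q2, k21 * q1 - k12 * q2]

        sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0],   # unit bolus into plasma
                        t_eval=np.linspace(0.0, 24.0, 7))
        for t, q1 in zip(sol.t, sol.y[0]):
            print(f"t={t:5.1f} h  plasma fraction={q1:.3f}")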

  17. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limit the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools can be used efficiently to guide the design of constructs for engineered nucleases and to evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it describes tools that have been developed to analyse post-genome-editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
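
    At their core, many off-target predictors enumerate genomic windows within a mismatch budget of the guide sequence. The toy scan below illustrates only that core idea; the guide and genome strings are invented, and real tools add PAM constraints, bulges and empirical scoring models.

        # Naive off-target scan: windows within a Hamming-distance budget.
        def off_targets(genome: str, guide: str, max_mm: int = 3):
            k = len(guide)
            for i in range(len(genome) - k + 1):
                mm = sum(a != b for a, b in zip(genome[i:i + k], guide))
                if mm <= max_mm:
                    yield i, genome[i:i + k], mm

        genome = "ACGTACGTTTGACCGGATTACGTACGATTTGACCGGATAACG"   # toy sequence
        guide = "TTGACCGGATTA"                                  # hypothetical guide
        for pos, site, mm in off_targets(genome, guide, max_mm=2):
            print(f"pos {pos:2d}  {site}  mismatches={mm}")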

  18. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology, by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) define and discuss the place of negotiated interactions in airline operations, identifying how these

  19. Empowering patients of a mental rehabilitation center in a low-resource context: a Moroccan experience as a case study.

    Science.gov (United States)

    Khabbache, Hicham; Jebbar, Abdelhak; Rania, Nadia; Doucet, Marie-Chantal; Watfa, Ali Assad; Candau, Joël; Martini, Mariano; Siri, Anna; Brigo, Francesco; Bragazzi, Nicola Luigi

    2017-01-01

    Mental, neurological and substance use (MNS) disorders represent a major source of disability and premature mortality worldwide. However, in developing countries patients with MNS disorders are often poorly managed and treated, particularly in marginalized, impoverished areas where the mental health gap and the treatment gap can reach 90%. Efforts should be made to promote help by making mental health care more accessible. In this article, we address the challenges that psychological and psychiatric services have to face in a low-resource context, taking our experience at a Moroccan rehabilitation center as a case study. A sample of 60 patients was interviewed using a semi-structured questionnaire during the period 2014-2015. The questionnaire investigated the reactions and feelings of the patients to the rehabilitation program and their perceived psychological status and mental improvement, if any. Interviews were then transcribed and processed using ATLAS.ti V.7.0 qualitative analysis software. Frequency and co-occurrence analyses were carried out. Despite approximately 30 million inhabitants within the working-age group, Morocco suffers from a shortage of specialized health workers. Our ethnographic observations show that psychiatric treatment can be ensured, notwithstanding these hurdles, if a public health perspective is assumed. In resource-limited settings, working in the field of mental health means putting oneself on the line, exposing oneself to new experiences, and reorganizing one's own skills and expertise. In the present article, we have used our clinical experience at a rehabilitation center in Fes as a case study and have shown how peer therapy can be used to overcome the obstacles that we encounter daily in a setting of limited resources.

  20. Design of SCADA water resource management control center by a bi-objective redundancy allocation problem and particle swarm optimization

    International Nuclear Information System (INIS)

    Dolatshahi-Zand, Ali; Khalili-Damghani, Kaveh

    2015-01-01

    SCADA is an essential system for controlling critical facilities in big cities. SCADA is utilized in several sectors, such as water resource management, power plants, electricity distribution centers, traffic control centers, and gas distribution. The failure of SCADA results in crisis. Hence, designing a SCADA system to deliver high reliability under a limited budget and other constraints is essential. In this paper, a bi-objective redundancy allocation problem (RAP) is proposed to design Tehran's SCADA water resource management control center. Reliability maximization and cost minimization are considered concurrently. Since the proposed RAP is a non-linear multi-objective mathematical program, exact methods cannot handle it efficiently. A multi-objective particle swarm optimization (MOPSO) algorithm is designed to solve it. Several features, such as dynamic parameter tuning, efficient constraint handling and Pareto gridding, are incorporated in the proposed MOPSO. The results of the proposed MOPSO are compared with those of an efficient ε-constraint method. Several non-dominated designs of the SCADA system are generated using both methods. Comparison metrics based on the accuracy and diversity of the Pareto front are calculated for both methods. The proposed MOPSO algorithm shows better performance. Finally, in order to choose a practical design, the TOPSIS algorithm is used to prune the Pareto front. - Highlights: • A multi-objective redundancy allocation problem (MORAP) is proposed to design the SCADA system. • Multi-objective particle swarm optimization (MOPSO) is proposed to solve MORAP. • An efficient epsilon-constraint method is adapted to solve MORAP. • Non-dominated solutions are generated on the Pareto front of MORAP by both methods. • Several multi-objective metrics are calculated to compare the performance of the methods
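
    The structure of the underlying RAP is easy to show at toy scale: series-connected subsystems with parallel redundancy, trading reliability against cost. The sketch below simply enumerates designs and keeps the non-dominated ones, standing in for MOPSO on a problem small enough to enumerate; all reliabilities and costs are assumed values.

        from itertools import product

        # Toy series system, three subsystems, redundancy level 1..4 each.
        r = [0.90, 0.85, 0.95]     # component reliabilities (assumed)
        c = [3.0, 5.0, 2.0]        # component costs (assumed)

        designs = []
        for n in product(range(1, 5), repeat=3):
            rel = 1.0
            for ri, ni in zip(r, n):
                rel *= 1.0 - (1.0 - ri) ** ni      # parallel redundancy per stage
            cost = sum(ci * ni for ci, ni in zip(c, n))
            designs.append((n, rel, cost))

        # Keep the reliability/cost Pareto front.
        pareto = [d for d in designs
                  if not any(o[1] >= d[1] and o[2] <= d[2] and o != d
                             for o in designs)]
        for n, rel, cost in sorted(pareto, key=lambda d: d[2]):
            print(f"n={n}  R={rel:.4f}  cost={cost:.1f}")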

  1. Postoperative Central Nervous System Infection After Neurosurgery in a Modernized, Resource-Limited Tertiary Neurosurgical Center in South Asia.

    Science.gov (United States)

    Chidambaram, Swathi; Nair, M Nathan; Krishnan, Shyam Sundar; Cai, Ling; Gu, Weiling; Vasudevan, Madabushi Chakravarthy

    2015-12-01

    Postoperative central nervous system infections (PCNSIs) are rare but serious complications after neurosurgery. The purpose of this study was to examine the prevalence and causative pathogens of PCNSIs at a modernized, resource-limited neurosurgical center in South Asia. A retrospective analysis was conducted of the medical records of all 363 neurosurgical cases performed between June 1, 2012, and June 30, 2013, at a neurosurgical center in South Asia. Data from all operative neurosurgical cases during the 13-month period were included. Cerebrospinal fluid (CSF) analysis indicated that 71 of the 363 surgical cases had low CSF glucose or CSF leukocytosis. These 71 cases were categorized as PCNSIs. The PCNSIs with positive CSF cultures (9.86%) all had gram-negative bacteria with Pseudomonas aeruginosa (n = 5), Escherichia coli (n = 1), or Klebsiella pneumoniae (n = 1). The data suggest a higher rate of death (P = 0.031), a higher rate of CSF leak (P < 0.001), and a higher rate of cranial procedures (P < 0.001) among the infected patients and a higher rate of CSF leak among the patients with culture-positive infections (P = 0.038). This study summarizes the prevalence, causative organism of PCNSI, and antibiotic usage for all of the neurosurgical cases over a 13-month period in a modernized yet resource-limited neurosurgical center located in South Asia. The results from this study highlight the PCNSI landscape in an area of the world that is often underreported in the neurosurgical literature because of the paucity of clinical neurosurgical research undertaken there. This study shows an increasing prevalence of gram-negative organisms in CSF cultures from PCNSIs, which supports a trend in the recent literature of increasing gram-negative bacillary meningitis. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Readability of Online Patient Educational Resources Found on NCI-Designated Cancer Center Web Sites.

    Science.gov (United States)

    Rosenberg, Stephen A; Francis, David; Hullett, Craig R; Morris, Zachary S; Fisher, Michael M; Brower, Jeffrey V; Bradley, Kristin A; Anderson, Bethany M; Bassetti, Michael F; Kimple, Randall J

    2016-06-01

    The NIH and Department of Health & Human Services recommend that online patient information (OPI) be written at a sixth grade level. We used a panel of readability analyses to assess OPI from NCI-Designated Cancer Center (NCIDCC) Web sites. Cancer.gov was used to identify 68 NCIDCC Web sites from which we collected both general OPI and OPI specific to breast, prostate, lung, and colon cancers. This text was analyzed by 10 commonly used readability tests: the New Dale-Chall Readability Formula, Flesch Reading Ease scale, Flesch-Kincaid Grade Level, FORCAST scale, Fry Readability Graph, Simple Measure of Gobbledygook test, Gunning Frequency of Gobbledygook index, New Fog Count, Raygor Readability Estimate Graph, and Coleman-Liau Index. We tested the hypothesis that the readability of NCIDCC OPI was written at the sixth grade level. Secondary analyses were performed to compare the readability of OPI between comprehensive and noncomprehensive centers, by region, and against OPI produced by the American Cancer Society (ACS). A mean of 30,507 words from 40 comprehensive and 18 noncomprehensive NCIDCCs was analyzed (7 nonclinical and 3 without appropriate OPI were excluded). Using a composite grade-level score, the mean readability score of 12.46 (ie, college level; 95% CI, 12.13-12.79) was significantly greater than the target grade level of 6 (middle school) across all readability metrics (P<.05). ACS OPI uses easier language, at the seventh to ninth grade level, across all tests (P<.01). OPI from NCIDCC Web sites is more complex than recommended for the average patient. Copyright © 2016 by the National Comprehensive Cancer Network.
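
    One of the simpler members of such panels, the Flesch-Kincaid Grade Level, is easy to reproduce: 0.39 x (words per sentence) + 11.8 x (syllables per word) - 15.59. A minimal sketch with a crude syllable heuristic; the sample passage is invented, not drawn from any NCIDCC site.

        import re

        def syllables(word: str) -> int:
            # crude vowel-group heuristic; real readability tools use dictionaries
            n = len(re.findall(r"[aeiouy]+", word.lower()))
            if word.lower().endswith("e") and n > 1:
                n -= 1
            return max(n, 1)

        def fk_grade(text: str) -> float:
            sentences = max(len(re.findall(r"[.!?]+", text)), 1)
            words = re.findall(r"[A-Za-z']+", text)
            syl = sum(syllables(w) for w in words)
            return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

        sample = ("Radiation therapy uses high energy rays to destroy cancer cells. "
                  "Your care team will explain the schedule and possible side effects.")
        print(f"estimated grade level: {fk_grade(sample):.1f}")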

  3. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously

  4. Astigmatic single photon emission computed tomography imaging with a displaced center of rotation

    International Nuclear Information System (INIS)

    Wang, H.; Smith, M.F.; Stone, C.D.; Jaszczak, R.J.

    1998-01-01

    A filtered backprojection algorithm is developed for single photon emission computed tomography (SPECT) imaging with an astigmatic collimator having a displaced center of rotation. The astigmatic collimator has two perpendicular focal lines, one that is parallel to the axis of rotation of the gamma camera and one that is perpendicular to this axis. Using SPECT simulations of projection data from a hot rod phantom and point source arrays, it is found that a lack of incorporation of the mechanical shift in the reconstruction algorithm causes errors and artifacts in reconstructed SPECT images. The collimator and acquisition parameters in the astigmatic reconstruction formula, which include focal lengths, radius of rotation, and mechanical shifts, are often partly unknown and can be determined using the projections of a point source at various projection angles. The accurate determination of these parameters by a least squares fitting technique using projection data from numerically simulated SPECT acquisitions is studied. These studies show that the accuracy of parameter determination is improved as the distance between the point source and the axis of rotation of the gamma camera is increased. The focal length to the focal line perpendicular to the axis of rotation is determined more accurately than the focal length to the focal line parallel to this axis. copyright 1998 American Association of Physicists in Medicine
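
    The artifact mechanism described above can be illustrated in the simpler parallel-beam geometry: resample each projection by the known mechanical shift, then apply standard ramp-filtered backprojection. The sketch below makes exactly that simplification and uses an idealized point-source sinogram, so it is not the paper's astigmatic fan-beam algorithm.

        import numpy as np

        def fbp(sinogram, angles_deg, cor_shift=0.0):
            """Parallel-beam FBP with a center-of-rotation shift correction."""
            n_ang, n_det = sinogram.shape
            t = np.arange(n_det) - (n_det - 1) / 2.0
            # 1) undo the mechanical shift by resampling each projection
            shifted = np.array([np.interp(t, t - cor_shift, p) for p in sinogram])
            # 2) ramp filter in the frequency domain
            ramp = np.abs(np.fft.fftfreq(n_det))
            filtered = np.real(np.fft.ifft(np.fft.fft(shifted, axis=1) * ramp, axis=1))
            # 3) backproject onto the image grid
            xx, yy = np.meshgrid(t, t)
            img = np.zeros((n_det, n_det))
            for a, proj in zip(np.deg2rad(angles_deg), filtered):
                s = xx * np.cos(a) + yy * np.sin(a)   # detector coordinate of (x, y)
                img += np.interp(s, t, proj)
            return img * np.pi / n_ang

        # Smoke test: off-center point source, ideal sinogram, 1-bin COR error.
        angles = np.arange(0.0, 180.0, 1.0)
        n_det = 65
        t = np.arange(n_det) - (n_det - 1) / 2.0
        x0, y0, shift = 10.0, -5.0, 1.0
        rad = np.deg2rad(angles)
        centers = x0 * np.cos(rad) + y0 * np.sin(rad) + shift
        sino = np.exp(-(t[None, :] - centers[:, None]) ** 2)
        rec = fbp(sino, angles, cor_shift=shift)
        print("peak near (row, col):", np.unravel_index(rec.argmax(), rec.shape))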

  5. Establishment of computed tomography reference dose levels in Onassis Cardiac Surgery Center

    International Nuclear Information System (INIS)

    Tsapaki, V.; Kyrozi, E.; Syrigou, T.; Mastorakou, I.; Kottou, S.

    2001-01-01

    The purpose of the study was to apply the European Commission (EC) Reference Dose Levels (RDLs) to Computed Tomography (CT) examinations at the Onassis Cardiac Surgery Center (OCSC). These are the weighted CT Dose Index (CTDIw) for a single slice and the Dose-Length Product (DLP) for a complete examination. During the period 1998-1999, the total number of CT examinations, every type of CT examination, patient-related data and the technical parameters of the examinations were recorded. The most frequent examinations, which were the head, chest, abdomen and pelvis, were chosen for investigation. CTDI measurements were performed and CTDIw and DLP were calculated. Third-quartile values of CTDIw were 43 mGy for head, 8 mGy for chest, and 22 mGy for abdomen and pelvis examinations. Third-quartile values of DLP were 740 mGy cm for head, 370 mGy cm for chest, 490 mGy cm for abdomen and 420 mGy cm for pelvis examinations. The results confirm that OCSC successfully follows the proposed RDLs for head, chest, abdomen and pelvis examinations in terms of radiation dose. (author)
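
    Both quantities follow from simple phantom arithmetic: CTDIw combines the central and peripheral CTDI100 readings with weights of one third and two thirds, and for an axial series DLP scales CTDIw by the irradiated length. A sketch with invented phantom readings, deliberately chosen to land near the head values quoted above:

        # EC-style CT dosimetry arithmetic; the phantom readings are made up.
        def ctdi_w(ctdi_center: float, ctdi_periphery: float) -> float:
            """Weighted CTDI (mGy): 1/3 central + 2/3 peripheral reading."""
            return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

        def dlp_axial(ctdi_w_mgy: float, n_slices: int, slice_mm: float) -> float:
            """Dose-length product (mGy cm) for N axial slices of thickness T."""
            return ctdi_w_mgy * n_slices * slice_mm / 10.0   # mm -> cm

        cw = ctdi_w(ctdi_center=30.0, ctdi_periphery=40.0)   # head phantom example
        print(f"CTDIw = {cw:.1f} mGy, DLP = {dlp_axial(cw, 20, 10.0):.0f} mGy cm")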

  6. Detailed prospective stages of inventory of U resources in Mentawa and Seruyan, Center of Kalimantan

    International Nuclear Information System (INIS)

    Ramadanus; Sudjiman; Agus, S.

    1996-01-01

    Indications of U mineralization had been found in the Mentawa River in biotite granite (1,500 cps) and in metasiltstone boulders (500-15,000 cps). This detailed prospecting stage was carried out to gather information on the geology and U mineralization and to assess the potential U resources. The methods used were geological mapping, radiometric measurement and stripping of selected outcrops; samples were taken from outcrops and anomalous boulders, and stream sediments were collected as mud and heavy-mineral samples. The fieldwork was backed up by laboratory analyses in the form of petrography, mineragraphy, autoradiography, and total and mobile U content. The results show that the stratigraphy of the Mentawa and Seruyan area consists of schist and tonalite. Outcropping U mineralization was found in schist, generally in the form of uraninite filling SSE-NNW fractures with subvertical to vertical dips. U mineralization in boulders was found in the form of uraninite, gummite and autunite associated with tourmaline. These U mineralizations were found in the Mentawa Satu River and the upper reaches of the Rengka River, over a distribution area of about 7 km²

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. Fort Collins Science Center Ecosystem Dynamics branch--interdisciplinary research for addressing complex natural resource issues across landscapes and time

    Science.gov (United States)

    Bowen, Zachary H.; Melcher, Cynthia P.; Wilson, Juliette T.

    2013-01-01

    The Ecosystem Dynamics Branch of the Fort Collins Science Center offers an interdisciplinary team of talented and creative scientists with expertise in biology, botany, ecology, geology, biogeochemistry, physical sciences, geographic information systems, and remote-sensing, for tackling complex questions about natural resources. As demand for natural resources increases, the issues facing natural resource managers, planners, policy makers, industry, and private landowners are increasing in spatial and temporal scope, often involving entire regions, multiple jurisdictions, and long timeframes. Needs for addressing these issues include (1) a better understanding of biotic and abiotic ecosystem components and their complex interactions; (2) the ability to easily monitor, assess, and visualize the spatially complex movements of animals, plants, water, and elements across highly variable landscapes; and (3) the techniques for accurately predicting both immediate and long-term responses of system components to natural and human-caused change. The overall objectives of our research are to provide the knowledge, tools, and techniques needed by the U.S. Department of the Interior, state agencies, and other stakeholders in their endeavors to meet the demand for natural resources while conserving biodiversity and ecosystem services. Ecosystem Dynamics scientists use field and laboratory research, data assimilation, and ecological modeling to understand ecosystem patterns, trends, and mechanistic processes. This information is used to predict the outcomes of changes imposed on species, habitats, landscapes, and climate across spatiotemporal scales. The products we develop include conceptual models to illustrate system structure and processes; regional baseline and integrated assessments; predictive spatial and mathematical models; literature syntheses; and frameworks or protocols for improved ecosystem monitoring, adaptive management, and program evaluation. The descriptions

  9. Nuclear structure and radioactive decay resources at the US National Nuclear Data Center

    International Nuclear Information System (INIS)

    Sonzogni, A.A.; Burrows, T.W.; Pritychenko, B.; Tuli, J.K.; Winchell, D.F.

    2008-01-01

    The National Nuclear Data Center has a long tradition of evaluating nuclear structure and decay data as well as offering tools to assist in nuclear science research and applications. With these tools, users can obtain recommended values for nuclear structure and radioactive decay observables as well as links to the relevant articles. The main databases and tools are ENSDF, NSR, NuDat and the new ENDF decay data library. The Evaluated Nuclear Structure Data File (ENSDF) stores recommended nuclear structure and decay data for all nuclei. ENSDF covers properties such as nuclear level energies, spins and parities, half-lives and decay modes; nuclear radiation energies and intensities for the different radiation types; and nuclear decay modes and their probabilities. The Nuclear Science References (NSR) is a bibliographic database containing nearly 200,000 nuclear science articles indexed according to content. About 4000 are added each year, covering 80 journals as well as conference proceedings and laboratory reports. NuDat is a software product with two main goals: to present nuclear structure and decay information from ENSDF in a user-friendly way, and to allow users to execute complex search operations on the wealth of data contained in ENSDF. The recently released ENDF/B-VII.0 contains a decay data sub-library which has been derived from ENSDF. The way all these databases and tools are offered to the public has undergone a drastic improvement due to advancements in information technology

  10. Iowa State University's undergraduate minor, online graduate certificate and resource center in NDE

    Science.gov (United States)

    Bowler, Nicola; Larson, Brian F.; Gray, Joseph N.

    2014-02-01

    Nondestructive evaluation is a 'niche' subject that is not yet offered as an undergraduate or graduate major in the United States. The undergraduate minor in NDE offered within the College of Engineering at Iowa State University (ISU) provides a unique opportunity for aspiring undergraduate engineers to obtain a qualification in the multi-disciplinary subject of NDE. The minor requires 16 credits of course work, within which a core course and laboratory in NDE are compulsory. The industrial sponsors of Iowa State's Center for Nondestructive Evaluation, and others, strongly support the NDE minor and actively recruit students from this pool. Since 2007 the program has graduated 10 students per year and enrollment is rising. In 2011, ISU's College of Engineering established an online graduate certificate in NDE, accessible not only to campus-based students but also to practicing engineers via the web. The certificate teaches the fundamentals of three major NDE techniques: eddy-current, ultrasonic and X-ray methods. This paper describes the structure of these programs and plans for the development of an online, coursework-only Master of Engineering in NDE and a thesis-based Master of Science degree in NDE.

  11. RESOURCE TRAINING AND METHODOLOGICAL CENTER FOR THE TRAINING OF PEOPLE WITH DISABILITIES: EXPERIENCE AND DIRECTION OF DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    A. A. Fedorov

    2018-01-01

    Introduction: The article is devoted to a new and topical direction in the system of higher education: the development of inclusive education. It presents the experience of creating a resource training and methodological center (RTMC) at Minin University in 2017 and describes the directions of its activity in 2017 and the results achieved; it also outlines the role of the RTMC in the development of an inclusive culture. Materials and methods: The article is based on an analysis of the literature by domestic and foreign authors, and on monitoring data on the state of inclusive higher education gathered within the framework of State Contract No. 05.020.11 007 of 07.06.2016 on the project "Monitoring Information and Analytical Support of the Activities of Regional Resource Centers for Higher Education for Disabled People". Results: Analyzing the results of the RTMC's activity, the authors identify the problems that arose during project implementation and suggest ways of solving them. The authors see the development of the RTMC in the development of forms and mechanisms of interdepartmental, interregional and inter-institutional cooperation, in order to achieve coherence of action and effectiveness among all participants in the support of inclusion in higher education, taking into account the educational needs of entrants and the needs of the labor market throughout the assigned territory. As a special mission of the RTMC, the authors see the management of the development of an inclusive culture at the university. The system of higher education is considered an instrument for fulfilling the social mandate to form a generation of people who tolerate and organically accept inclusion in all spheres of life. Discussion and conclusion: The role of the resource training and methodological center in the development of inclusive higher education is determined by the identification

  12. Pricing the Services of the Computer Center at the Catholic University of Louvain. Program on Institutional Management in Higher Education.

    Science.gov (United States)

    Hecquet, Ignace; And Others

    Principles are outlined that are used as a basis for the system of pricing the services of the Computer Centre. The system illustrates the use of a management method to secure better utilization of university resources. Departments decide how to use the appropriations granted to them and establish a system of internal prices that reflect the cost…

  13. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study aimed to compare two soft computing methods for modeling Cd, Pb and Zn concentrations in the groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of two soft computing models, namely the multi-layer perceptron (MLP) and the radial basis function (RBF), for forecasting heavy metal concentrations was investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed using the MATLAB R2014 software. The MLP performed better than the other models for heavy metal concentration estimation. The simulation results revealed that the MLP model was able to model heavy metal concentrations in groundwater resources favorably, and it can be utilized effectively in environmental applications and water quality estimation. In addition, of the three training algorithms, Levenberg-Marquardt was better than the others. This study proposed soft computing modeling techniques for the prediction and estimation of heavy metal concentrations in the groundwater resources of Asadabad Plain. Based on data collected from the plain, MLP and RBF models were developed for each heavy metal. MLP can be utilized effectively in applications predicting heavy metal concentrations in the groundwater resources of Asadabad Plain.
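
    A present-day equivalent of the MLP experiment can be sketched with scikit-learn. Note the substitutions: scikit-learn trains with 'lbfgs' rather than Levenberg-Marquardt, and the data below are synthetic stand-ins for the Asadabad Plain measurements.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        # Synthetic stand-in data: hydrochemical inputs -> metal concentration.
        rng = np.random.default_rng(4)
        X = rng.random((200, 4))                  # e.g. pH, EC, depth, distance
        y = (0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.1 * X[:, 2] ** 2
             + 0.01 * rng.standard_normal(200))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        mlp = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                           max_iter=2000, random_state=0).fit(X_tr, y_tr)
        print(f"test R^2: {mlp.score(X_te, y_te):.3f}")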

  14. Fiscal Year 1987 program report: Rhode Island Water Resource Research Center

    International Nuclear Information System (INIS)

    Poon, P.C.

    1988-07-01

    The 1987 program objective was to conduct studies and research of value to the New England region as well as to assist in the solution of problems in the State of Rhode Island. Current and anticipated state and regional water problems are the contamination of surface water and groundwater by natural radioactivity such as radon, by chemicals from industrial and agricultural activities, by septic tanks and leach fields, and by improperly managed landfills, together with a lack of public awareness and public participation in water-quality protection and management. The 1987 program found that epithermal neutron-activation analysis was best suited for measuring uranium and thorium, of which radon is a decay product. Lower U and Th concentrations were found in calc-alkalic and mafic volcanic rocks, while higher concentrations were found in the alkalic and peraluminous rocks. A computer model using the finite-element method to simulate fluid flow through fractured porous media was developed for predicting the extent of groundwater contamination in the State

  15. Screening Genetic Resources of Capsicum Peppers in Their Primary Center of Diversity in Bolivia and Peru.

    Science.gov (United States)

    van Zonneveld, Maarten; Ramirez, Marleni; Williams, David E; Petz, Michael; Meckelmann, Sven; Avila, Teresa; Bejarano, Carlos; Ríos, Llermé; Peña, Karla; Jäger, Matthias; Libreros, Dimary; Amaya, Karen; Scheldeman, Xavier

    2015-01-01

    For most crops, like Capsicum, diversity remains under-researched for traits of interest for food, nutrition and other purposes. A small investment in screening this diversity for a wide range of traits is likely to reveal many traditional varieties with distinctive values. One objective of this study was to demonstrate, with Capsicum as a model crop, the application of indicators of phenotypic and geographic diversity as effective criteria for selecting promising genebank accessions for multiple uses from crop centers of diversity. A second objective was to evaluate the expression of biochemical and agromorphological properties of the selected Capsicum accessions under different conditions. Four steps were involved: 1) develop the necessary diversity by expanding genebank collections in Bolivia and Peru; 2) establish representative subsets of ~100 accessions for biochemical screening of Capsicum fruits; 3) select promising accessions for different uses after screening; and 4) examine how these promising accessions express biochemical and agromorphological properties when grown in different environmental conditions. The Peruvian Capsicum collection now contains 712 accessions encompassing all five domesticated species (C. annuum, C. chinense, C. frutescens, C. baccatum, and C. pubescens). The collection in Bolivia now contains 487 accessions, representing all five domesticates plus four wild taxa (C. baccatum var. baccatum, C. caballeroi, C. cardenasii, and C. eximium). Following the biochemical screening, 44 Bolivian and 39 Peruvian accessions were selected as promising, representing wide variation in levels of antioxidant capacity, capsaicinoids, fat, flavonoids, polyphenols, quercetins, tocopherols, and color. In Peru, 23 promising accessions performed well in different environments, while each of the promising Bolivian accessions performed well only in a certain environment. Differences in Capsicum diversity and local contexts led to distinct outcomes in

  16. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application that calculates the computing resources needed by the LHCb experiment for data processing and analysis, and predicts their evolution in future years. The source code was developed in the Python programming language, and the application was built and developed in CERN GitLab. The application will facilitate the calculation of the resources required by LHCb in both qualitative and quantitative terms. The granularity of the computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations aim to give more realistic and accurate estimates.
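
    The record does not reproduce the application's source code, so the following is only a minimal sketch of the kind of weekly-granularity projection it describes; all rates, event sizes, and the WeekPlan structure are invented for illustration, not taken from the LHCb computing model.

```python
# Minimal sketch of a weekly-granularity resource projection, in the spirit
# of the application described in the record. All parameters are invented.
from dataclasses import dataclass

@dataclass
class WeekPlan:
    live_seconds: float      # accelerator live time in the week
    trigger_rate_hz: float   # hypothetical output trigger rate
    event_size_kb: float     # hypothetical raw event size
    cpu_s_per_event: float   # hypothetical CPU seconds per event

def project(weeks):
    """Accumulate tape volume (PB) and average concurrent cores per week."""
    tape_pb, cores = 0.0, []
    for w in weeks:
        events = w.live_seconds * w.trigger_rate_hz
        tape_pb += events * w.event_size_kb / 1e12          # kB -> PB
        cores.append(events * w.cpu_s_per_event / (7 * 24 * 3600))
    return tape_pb, cores

weeks = [WeekPlan(3.5e5, 10_000, 60, 2.0) for _ in range(4)]  # four running weeks
tape, cores = project(weeks)
print(f"tape after 4 weeks: {tape:.2f} PB; avg cores needed: {sum(cores)/len(cores):,.0f}")
```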

  17. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm which converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRHs). A precondition for centralized processing in the BBU pool is an interconnecting fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network was proposed in our previous work. To overcome the complexity and latency, this paper first presents a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme that considers network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull remote processing resources to the local site to implement cooperative radio resource management, enhance responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.
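
    The record names the CSRP/CSP architecture but gives no algorithmic detail. Purely as a toy illustration of cross-stratum placement with protection, the sketch below serves each demand from the lowest-latency node with spare capacity and reserves backup capacity on a second node; the nodes, capacities, latencies, and greedy policy are all invented and are not the authors' scheme.

```python
# Toy allocator illustrating the general idea of cross-stratum placement
# with protection: serve each demand from the lowest-latency node with
# spare capacity, and reserve backup capacity on a second node.
# This is NOT the CSP scheme from the record; all values are invented.

nodes = {  # name: {"latency_ms": ..., "capacity": ...}
    "fog-1": {"latency_ms": 2,  "capacity": 4},
    "fog-2": {"latency_ms": 3,  "capacity": 4},
    "cloud": {"latency_ms": 20, "capacity": 100},
}

def place(demand_units):
    ranked = sorted(nodes, key=lambda n: nodes[n]["latency_ms"])
    primary = next((n for n in ranked
                    if nodes[n]["capacity"] >= demand_units), None)
    if primary is None:
        return None  # demand is blocked
    nodes[primary]["capacity"] -= demand_units
    # Protection: reserve the same amount on a different node if possible.
    backup = next((n for n in ranked if n != primary
                   and nodes[n]["capacity"] >= demand_units), None)
    if backup:
        nodes[backup]["capacity"] -= demand_units
    return primary, backup

for i in range(3):
    print(f"service {i}:", place(2))
```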

  18. Establish and manage a National Resource Center for plutonium, Quarterly report, April 1, 1995--June 30, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Mulder, R.

    1995-06-27

    The initial phase of the Plutonium Information Resource is well under way. Board members developed linkages with Russian scientists and engineers and obtained names of technical team members. Nuclear proposals were reviewed by the Nuclear Review Group, and the proposals were modified to incorporate the review group's comments. Portions of the proposals were approved by the Governing Board. Proposals for education and outreach were reviewed by the Education Proposal Review Group, considered by the Governing Board, and approved. The Senior Technical Review Group met to consider the R&D programs associated with fissile materials disposal. A newsletter was published. Progress continued on the high explosives demonstration project, on site-specific environmental work, and on the multiattribute utility analysis. Center offices in Amarillo were furnished, equipment was purchased, and the lease was modified.

  19. U.S. Department of Energy Regional Resource Centers Report: State of the Wind Industry in the Regions

    Energy Technology Data Exchange (ETDEWEB)

    Baranowski, Ruth [National Renewable Energy Lab. (NREL), Golden, CO (United St; Oteri, Frank [National Renewable Energy Lab. (NREL), Golden, CO (United St; Baring-Gould, Ian [National Renewable Energy Lab. (NREL), Golden, CO (United St; Tegen, Suzanne [National Renewable Energy Lab. (NREL), Golden, CO (United St

    2016-03-01

    The wind industry and the U.S. Department of Energy (DOE) are addressing technical challenges to increasing wind energy's contribution to the national grid (such as reducing turbine costs and increasing energy production and reliability), and they recognize that public acceptance issues can be challenges for wind energy deployment. Wind project development decisions are best made using unbiased information about the benefits and impacts of wind energy. In 2014, DOE established six wind Regional Resource Centers (RRCs) to provide information about wind energy, focusing on regional qualities. This document summarizes the status and drivers for U.S. wind energy development on regional and state levels. It is intended to be a companion to DOE's 2014 Distributed Wind Market Report, 2014 Wind Technologies Market Report, and 2014 Offshore Wind Market and Economic Analysis that provide assessments of the national wind markets for each of these technologies.

  20. Establish and manage a National Resource Center for plutonium, Quarterly report, April 1, 1995--June 30, 1995

    International Nuclear Information System (INIS)

    Mulder, R.

    1995-01-01

    The initial phase of the Plutonium Information Resource is well under way. Board members developed linkages with Russian scientists and engineers and obtained names of technical team members. Nuclear proposals were reviewed by the Nuclear Review Group, and the proposals were modified to incorporate the review group's comments. Portions of the proposals were approved by the Governing Board. Proposals for education and outreach were reviewed by the Education Proposal Review Group, considered by the Governing Board, and approved. The Senior Technical Review Group met to consider the R&D programs associated with fissile materials disposal. A newsletter was published. Progress continued on the high explosives demonstration project, on site-specific environmental work, and on the multiattribute utility analysis. Center offices in Amarillo were furnished, equipment was purchased, and the lease was modified.

  1. U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center-Fiscal Year 2009 Annual Report

    Science.gov (United States)

    Nelson, Janice S.

    2010-01-01

    The Earth Resources Observation and Science (EROS) Center is a U.S. Geological Survey (USGS) facility focused on providing science and imagery to better understand our Earth. As part of the USGS Geography Discipline, EROS contributes to the Land Remote Sensing (LRS) Program, the Geographic Analysis and Monitoring (GAM) Program, and the National Geospatial Program (NGP), as well as to our Federal partners and cooperators. The work of the Center is shaped by the Earth sciences and the missions of our stakeholders, and is implemented through strong program and project management and the application of state-of-the-art information technologies. Fundamentally, EROS contributes to the understanding of a changing Earth through 'research to operations' activities that include developing, implementing, and operating the remote-sensing-based terrestrial monitoring capabilities needed to address interdisciplinary science and applications objectives at all levels, both nationally and internationally. The Center's programs and projects continually strive to meet and exceed the changing needs of the USGS, the Department of the Interior, our Nation, and international constituents. The Center's multidisciplinary staff uses their unique expertise in remote sensing science and technologies to conduct basic and applied research, data acquisition, systems engineering, information access and management, and archive preservation to address the Nation's most critical needs. Of particular note is the role of EROS as the primary provider of Landsat data, the longest comprehensive global land Earth observation record ever collected. This report provides an overview of the scientific and engineering achievements and illustrates the range and scope of the activities and accomplishments at EROS throughout fiscal year (FY) 2009. Additional information concerning the scientific, engineering, and operational achievements can be obtained from the scientific papers and other documents published by

  2. Fish bone foreign body presenting with an acute fulminating retropharyngeal abscess in a resource-challenged center: a case report

    Directory of Open Access Journals (Sweden)

    Oyewole Ezekiel O

    2011-04-01

    Full Text Available Introduction: A retropharyngeal abscess is a potentially life-threatening infection in the deep space of the neck, which can compromise the airway. Its management requires highly specialized care, including surgery and intensive care, to reduce mortality. This is the first case of a gas-forming abscess reported from this region, but not the first such report in the literature. Case presentation: We present the case of a 16-month-old Yoruba baby girl with a gas-forming retropharyngeal abscess secondary to a fish bone foreign body, with laryngeal spasm that was managed in the recovery room. We highlight the specific problems encountered in the management of this case in a resource-challenged center such as ours. Conclusion: We describe an unusual presentation of a gas-forming organism causing a retropharyngeal abscess in a child. The patient's condition was treated despite inadequate resources for its management. We recommend early recognition through adequate evaluation of any oropharyngeal injury or infection, and early referral to a specialist with prompt surgical intervention.

  3. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and on each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP); however, waste water may be effectively controlled. The study also demonstrates that, along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from heavy pollution to light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  4. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    Science.gov (United States)

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study, the degaussing method was adopted to destroy the residual data on waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter parallel with the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t significantly improves the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  5. Increasing efficiency of job execution with resource co-allocation in distributed computer systems

    OpenAIRE

    Cankar, Matija

    2014-01-01

    The field of distributed computer systems, while not new in computer science, is still the subject of a lot of interest in both industry and academia. More powerful computers, faster and more ubiquitous networks, and complex distributed applications are accelerating the growth of distributed computing. Large numbers of computers interconnected in a single network provide additional computing power to users whenever required. Such systems are, however, expensive and complex to manage, which ca...

  6. A Two-Tier Energy-Aware Resource Management for Virtualized Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2016-01-01

    Full Text Available Electric power accounts for the most significant part of a data center's total cost; thus energy conservation is an important issue in cloud computing systems. One well-known technique to reduce energy consumption is the consolidation of Virtual Machines (VMs). However, for dynamic workloads it may sacrifice both energy savings and Quality of Service (QoS). Fortunately, Dynamic Frequency and Voltage Scaling (DVFS) is an efficient technique for saving energy in dynamic environments. In this paper, combined with the DVFS technology, we propose a cooperative two-tier energy-aware management method comprising local DVFS control and global VM deployment. The DVFS controller adjusts the frequencies of the homogeneous processors in each server at run-time based on a practical energy prediction. On the other hand, the global scheduler assigns VMs to the designated servers in cooperation with the local DVFS controllers. The evaluation results demonstrate the effectiveness of our two-tier method in energy saving.
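
    The record describes the two tiers only in outline. The sketch below illustrates the general shape of such a scheme — a per-server frequency step chosen from the predicted load, and a global placement that greedily minimizes the predicted energy increase; the frequency levels, power model, and greedy policy are invented stand-ins, not the paper's method.

```python
# Toy two-tier energy-aware manager: a local "DVFS" step picks the lowest
# frequency that covers each server's load, and a global scheduler places
# each VM where the predicted energy increase is smallest.
# Power model, frequencies, and policy are invented for illustration.

FREQS = [1.0, 1.5, 2.0, 2.6]         # available frequency steps (GHz)

def pick_freq(load):                 # local DVFS controller
    """Lowest frequency (GHz) whose capacity covers the load."""
    return next((f for f in FREQS if f >= load), FREQS[-1])

def power(load):                     # toy cubic-in-frequency power model
    f = pick_freq(load)
    return 50 + 30 * f**3            # watts: idle part + dynamic part

servers = [0.4, 1.2, 2.1]            # current load per server (GHz-equivalents)

def place(vm_load):                  # global scheduler
    deltas = [power(s + vm_load) - power(s) for s in servers]
    best = min(range(len(servers)), key=lambda i: deltas[i])
    servers[best] += vm_load
    return best, deltas[best]

for vm in (0.3, 0.3, 0.8):
    idx, dw = place(vm)
    print(f"VM({vm} GHz) -> server {idx}, predicted +{dw:.0f} W")
```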

  7. Water-resources and land-surface deformation evaluation studies at Fort Irwin National Training Center, Mojave Desert, California

    Science.gov (United States)

    Densmore-Judy, Jill; Dishart, Justine E.; Miller, David; Buesch, David C.; Ball, Lyndsay B.; Bedrosian, Paul A.; Woolfenden, Linda R.; Cromwell, Geoffrey; Burgess, Matthew K.; Nawikas, Joseph; O'Leary, David; Kjos, Adam; Sneed, Michelle; Brandt, Justin

    2017-01-01

    The U.S. Army Fort Irwin National Training Center (NTC), in the Mojave Desert, obtains all of its potable water supply from three groundwater basins (Irwin, Langford, and Bicycle) within the NTC boundaries (fig. 1; California Department of Water Resources, 2003). Because of increasing water demands at the NTC, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army, completed several studies to evaluate water resources in the developed and undeveloped groundwater basins underlying the NTC. In all of the developed basins, groundwater withdrawals exceed natural recharge, resulting in water-level declines. However, artificial recharge with treated wastewater has had some success in offsetting water-level declines in Irwin Basin. Additionally, localized water-quality changes have occurred in some parts of Irwin Basin as a result of human activities (wastewater disposal practices, landscape irrigation, and/or leaking pipes). As part of the multifaceted, NTC-wide studies, traditional data-collection methods were used, including lithologic and geophysical logging of newly drilled boreholes and hydrologic data collection (water levels, water quality, aquifer tests, and wellbore flow). Because these data cover only a small portion of the 1,177-square-mile (mi²) NTC, regional geologic, gravity, aeromagnetic, and InSAR mapping was also done. In addition, ground and airborne electromagnetic surveys were completed and analyzed to provide more detailed subsurface information on a regional, base-wide scale. The traditional and regional ground and airborne data are being analyzed and will be used to help develop preliminary hydrogeologic framework and groundwater-flow models for all basins. This report is intended to provide an overview of recent water-resources and land-surface deformation studies at the NTC.

  8. Annual report of R and D activities in Center for Promotion of Computational Science and Engineering and Center for Computational Science and e-Systems from April 1, 2005 to March 31, 2006

    International Nuclear Information System (INIS)

    2007-03-01

    This report provides an overview of research and development activities in the Center for Computational Science and Engineering (CCSE), JAERI, in the first half of fiscal year 2005 (April 1, 2005 - September 30, 2005) and those in the Center for Computational Science and e-Systems (CCSE), JAEA, in the second half of fiscal year 2005 (October 1, 2005 - March 31, 2006). In the first half, the activities were performed by five research groups: the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group. At the beginning of the second half, upon the unification of JNC (Japan Nuclear Cycle Development Institute) and JAERI (Japan Atomic Energy Research Institute), these five groups were integrated into two offices, the Simulation Technology Research and Development Office and the Computer Science Research and Development Office, which operated the second-half activities. A major project, the ITBL (Information Technology Based Laboratory) project, together with fundamental computational research for atomic energy plants, was performed mainly by the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy in the first half, and by their integrated Computer Science Research and Development Office in the second half. The main result was the verification, using structural analysis of a real plant executable in the Grid environment, which received an Honorable Mention in the Analytic Challenge at the Supercomputing 2005 (SC05) conference. Materials science and bioinformatics research in the atomic energy field was carried out by three groups: the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics

  9. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer-based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  10. Are the resources adoptive for conducting team-based diabetes management clinics? An explorative study at primary health care centers in Muscat, Oman.

    Science.gov (United States)

    Al-Alawi, Kamila; Johansson, Helene; Al Mandhari, Ahmed; Norberg, Margareta

    2018-05-08

    Aim: The aim of this study is to explore perceptions among primary health center staff concerning competencies, values, skills and resources related to team-based diabetes management, and to describe the availability of the needed resources for team-based approaches. The diabetes epidemic challenges the services available at primary health care centers in the Middle East; therefore, there is a demand for evaluation of the available resources and of team-based diabetes management in relation to the National Diabetes Management Guidelines. A cross-sectional study was conducted at 26 public primary health care centers in Muscat, the capital of Oman. Data were collected from manual and electronic resources, as well as from a questionnaire distributed to the physician-in-charge and diabetes management team members. Findings: The study revealed significant differences between professional groups regarding how they perceived their own competencies, values and skills, as well as the available resources related to team-based diabetes management. The perceived competencies were high among all professions. The perceived team-related values and skills were also generally high, but with overall lower recordings among the nurses. This pattern, along with the fact that very few nurses have specialized qualifications, is a barrier to providing team-based diabetes management. Participants indicated that laboratory resources were sufficient; however, pharmacological, technical and human resources were reported to be lacking. Further work should be done at public primary diabetes management clinics in order to fully implement team-based diabetes management.

  11. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging, as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in the Protein Data Bank (PDB). To elucidate MP structures computationally, we developed a novel web resource, denoted PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D properties via DeepCNF (Deep Convolutional Neural Fields), which models not only the complex sequence-structure relationship but also the interdependency between adjacent property labels; (ii) we predict the 2D contact/distance map through deep transfer learning, which learns the patterns as well as the complex relationships between contacts/distances and protein features from non-membrane proteins; and (iii) we model the 3D structure by feeding the predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length < 700 residues) from human. These transmembrane proteins are classified according to the IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  12. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and to compute projections for 2020. The European Society of Therapeutic Radiology and Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computing the present gap and the additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy; by 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRT units, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRT units, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculating staff requirements due to anticipated changes in future radiotherapy practice has been proposed; this model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs in Switzerland. (orig.)
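
    The record reports the outputs of the ESTRO-QUARTS/IAEA-style computation but not the formulas. A rough sketch of the general approach — patients needing radiotherapy derived from incidence and utilization rates, then staff derived from workload norms — might look like the following; the workload norms used here (patients per TRT unit, per RO, per MP, per RTT) and the existing staff counts are illustrative placeholders, not the guideline values.

```python
# Rough sketch of a QUARTS-style capacity computation. The workload norms
# below are illustrative placeholders, NOT the actual guideline figures.
NORMS = {
    "TRT": 450,   # patients treated per teleradiotherapy unit per year (assumed)
    "RO":  250,   # patients per radiation oncologist per year (assumed)
    "MP":  500,   # patients per medical physicist per year (assumed)
    "RTT": 120,   # patients per radiotherapy technologist per year (assumed)
}

def required(incidence, rtu_rate, existing):
    """Required and additional resources given cancer incidence and RTU rate."""
    rt_patients = incidence * rtu_rate
    need = {k: int(-(-rt_patients // v)) for k, v in NORMS.items()}  # ceiling
    gap = {k: max(0, need[k] - existing.get(k, 0)) for k in NORMS}
    return rt_patients, need, gap

# Figures from the record: 34,041 of 50,427 patients needing RT by 2020
# (an overall RTU rate of about 0.675); existing counts are invented here.
patients, need, gap = required(50_427, 34_041 / 50_427, {"TRT": 70, "RO": 120})
print(f"{patients:.0f} patients need RT; need={need}; additional={gap}")
```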

  13. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development, the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered only a single paradigm. This paradigm is typically the one being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken: that of user-centered design, the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and, if necessary, adapted to him or her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered an ischemic brain stem stroke, leading to a severe motor and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in the literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  14. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water-level fluctuation, saturated thickness of the aquifer, and groundwater abstraction. Digital elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified: black cotton, deep red, and red sandy soils. Rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table peaks in January, after the northeast monsoon, and reaches its lowest level in October. Groundwater abstraction for domestic/stock and irrigation needs in Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water-level rise during the monsoon period; it is estimated as 173.9 MCM using a rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in Chithar River basin is estimated as 466.66 MCM and the dynamic reserve as about 187.7 MCM. In the present scenario, the aquifer is in a safe condition for the extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased by about 10% to 15% in the future.
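
    Because the abstract quotes each recharge component and the abstraction figure explicitly, the balance it implies can be reproduced directly; the sketch below uses only the numbers stated in the record (the alternative rainfall-infiltration-factor estimate of 173.9 MCM is an independent cross-check and is not summed).

```python
# Water balance for Chithar River basin using the figures quoted in the
# record (MCM = million cubic metres). Only the reported numbers are used.
recharge = {
    "monsoon rainfall infiltration (water-level rise method)": 170.05,
    "surface water bodies": 53.8,
    "irrigation return flow": 147.6,
}
abstraction = 148.84  # domestic/stock and irrigation draft

total_recharge = sum(recharge.values())
balance = total_recharge - abstraction
print(f"total annual recharge : {total_recharge:.2f} MCM")
print(f"annual abstraction    : {abstraction:.2f} MCM")
print(f"net balance           : {balance:+.2f} MCM "
      f"({abstraction / total_recharge:.0%} of recharge is drafted)")
```

    The roughly 40% draft relative to recharge that this arithmetic yields is consistent with the record's conclusion that the aquifer is currently in a safe condition.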

  15. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 code has been migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the Cray-based configuration. The purpose of this report is to document the differences between the new configuration and the parent Cray configuration, and to summarize the acceptance test results, which have shown that the migrated code functions correctly in the new environment
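
    The record summarizes, but does not reproduce, the acceptance tests. As a generic illustration of platform-migration certification, one might compare benchmark outputs from the parent and migrated configurations within a numeric tolerance; the file names and tolerances below are invented, and this is not the PORFLO-3 test suite.

```python
# Minimal platform-migration acceptance check: compare benchmark outputs
# from the parent (Cray) run against the migrated (workstation) run.
# File names and tolerances are invented for illustration.
import numpy as np

baseline = np.loadtxt("cray_benchmark_output.txt")        # hypothetical file
migrated = np.loadtxt("workstation_benchmark_output.txt") # hypothetical file

ok = np.allclose(baseline, migrated, rtol=1e-5, atol=1e-8)
worst = np.max(np.abs(baseline - migrated))
print(f"acceptance {'PASSED' if ok else 'FAILED'}; max abs diff = {worst:.3e}")
```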

  16. Empowering patients of a mental rehabilitation center in a low-resource context: a Moroccan experience as a case study

    Directory of Open Access Journals (Sweden)

    Khabbache H

    2017-04-01

    Full Text Available Mental, neurological and substance use (MNS) disorders represent a major source of disability and premature mortality worldwide. However, in developing countries patients with MNS disorders are often poorly managed and treated, particularly in marginalized, impoverished areas where the mental health gap and the treatment gap can reach 90%. Efforts should be made to promote help by making mental health care more accessible. In this article, we address the challenges that psychological and psychiatric services have to face in a low-resource context, taking our experience at a Moroccan rehabilitation center as a case study. A sample of 60 patients were interviewed using a semi-structured questionnaire during the period of

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource-constrained on all tiers of the computing system in 2012 and are working to ensure that the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then, in 2012, an average of 200 job slots has been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure, where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  18. Education, Outreach, and Diversity Partnerships and Science Education Resources From the Center for Multi-scale Modeling of Atmospheric Processes

    Science.gov (United States)

    Foster, S. Q.; Randall, D.; Denning, S.; Jones, B.; Russell, R.; Gardiner, L.; Hatheway, B.; Johnson, R. M.; Drossman, H.; Pandya, R.; Swartz, D.; Lanting, J.; Pitot, L.

    2007-12-01

    The need to improve the representation of cloud processes in climate models has been one of the most important limitations on the reliability of climate-change simulations. The new National Science Foundation-funded Center for Multi-scale Modeling of Atmospheric Processes (CMMAP) at Colorado State University (CSU) is a major research program addressing this problem over the next five years through a revolutionary new approach to representing cloud processes on their native scales, including the cloud-scale interactions among the many physical and chemical processes that are active in cloud systems. At the end of its first year, CMMAP has established effective partnerships between scientists, students, and teachers to meet its goals to: (1) provide first-rate graduate education in atmospheric science; (2) recruit diverse undergraduates into graduate education and careers in climate science; and (3) develop, evaluate, and disseminate educational resources designed to inform K-12 students, teachers, and the general public about the nature of the climate system, global climate change, and career opportunities in climate science. This presentation will describe the partners, the challenges and successes, and the measures of achievement involved in the integrated suite of programs launched in the first year. They include: (1) a new high school Colorado Climate Conference drawing prestigious climate scientists to speak to students; (2) a summer Weather and Climate Workshop at CSU and the National Center for Atmospheric Research introducing K-12 teachers to Earth system science and a rich toolkit of teaching materials; (3) a program from CSU's Little Shop of Physics reaching 50 schools and 20,000 K-12 students through the new "It's Up In the Air" program; and (4) expanded content, imagery, and interactives on clouds, weather, climate, and modeling for students, teachers, and the public on The Windows to the Universe web site at the University Corporation for Atmospheric Research

  19. Sexual and Reproductive Health Services and Related Health Information on Pregnancy Resource Center Websites: A Statewide Content Analysis.

    Science.gov (United States)

    Swartzendruber, Andrea; Newton-Levinson, Anna; Feuchs, Ashley E; Phillips, Ashley L; Hickey, Jennifer; Steiner, Riley J

    Pregnancy resource centers (PRCs) are nonprofit organizations with a primary mission of promoting childbirth among pregnant women. Given a new state grant program to publicly fund PRCs, we analyzed Georgia PRC websites to describe advertised services and related health information. We systematically identified all accessible Georgia PRC websites available from April to June 2016. Entire websites were obtained and coded using defined protocols. Of 64 reviewed websites, pregnancy tests and testing (98%) and options counseling (84%) were most frequently advertised. However, 58% of sites did not provide notice that PRCs do not provide or refer for abortion, and 53% included false or misleading statements regarding the need to make a decision about abortion or links between abortion and mental health problems or breast cancer. Advertised contraceptive services were limited to counseling about natural family planning (3%) and emergency contraception (14%). Most sites (89%) did not provide notice that PRCs do not provide or refer for contraceptives. Two sites (3%) advertised unproven "abortion reversal" services. Approximately 63% advertised ultrasound examinations, 22% sexually transmitted infection testing, and 5% sexually transmitted infection treatment. None promoted consistent and correct condom use; 78% with content about condoms included statements that seemed to be designed to undermine confidence in condom effectiveness. Approximately 84% advertised educational programs, and 61% material resources. Georgia PRC websites contain high levels of false and misleading health information; the advertised services do not seem to align with prevailing medical guidelines. Public funding for PRCs, an increasing national trend, should be rigorously examined. Increased regulation may be warranted to ensure quality health information and services. Copyright © 2017 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  20. Resources and Operations Section

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the data resources group with regard to numeric information support, the IBP data center, and the geoecology project. Systems ecology studies consisted of nonlinear analysis (time delays in a host-parasite model), dispersal of seeds by animals, three-dimensional computer graphics in ecology, spatial heterogeneity in ecosystems, and analysis of forest structure. Progress is also reported on the national inventory of biological monitoring programs, the ecological sciences information center, and educational activities.