WorldWideScience

Sample records for computing asc software

  1. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  2. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 001.3.2 and CPR 001.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. ASC-1

    DEFF Research Database (Denmark)

    Jakimoski, Goce; Khajuria, Samant

    2011-01-01

    The goal of the modes of operation for authenticated encryption is to achieve faster encryption and message authentication by performing both the encryption and the message authentication in a single pass, as opposed to the traditional encrypt-then-MAC approach, which requires two passes. Unfortunately, the use of a block cipher as a building block limits the performance of authenticated encryption schemes to at most one message block per block cipher evaluation. In this paper, we propose the authenticated encryption scheme ASC-1 (Authenticating Stream Cipher One). Similarly to LEX, ASC-1 uses leak extraction from different AES rounds to compute the key material that is XOR-ed with the message to compute the ciphertext. Unlike LEX, ASC-1 operates in a CFB fashion to compute an authentication tag over the encrypted message. We argue that ASC-1 is secure by reducing its (IND-CCA, INT…
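
    For context, the abstract contrasts two general approaches. The sketch below illustrates the difference between the traditional two-pass encrypt-then-MAC composition and a one-pass AEAD mode using generic AES primitives from the Python cryptography package; it is only a hedged illustration of the two workflows and is not an implementation of ASC-1 or LEX.

    ```python
    # Illustrative contrast: two-pass encrypt-then-MAC vs. one-pass AEAD.
    # Generic AES usage only; NOT the ASC-1 or LEX construction.
    import os
    from cryptography.hazmat.primitives import hashes, hmac
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    message = b"sample plaintext"

    # Pass 1: encrypt; Pass 2: MAC the ciphertext (encrypt-then-MAC).
    enc_key, mac_key, nonce = os.urandom(16), os.urandom(32), os.urandom(16)
    encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(message) + encryptor.finalize()
    mac = hmac.HMAC(mac_key, hashes.SHA256())
    mac.update(nonce + ciphertext)
    tag = mac.finalize()

    # One pass: an AEAD mode produces ciphertext and tag together.
    aead_key = AESGCM.generate_key(bit_length=128)
    aead_nonce = os.urandom(12)
    ct_and_tag = AESGCM(aead_key).encrypt(aead_nonce, message, None)

    print(len(ciphertext), len(tag), len(ct_and_tag))
    ```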

  6. Structure of AscE and Induced Burial Regions in AscE and AscG upon Formation of the Chaperone Needle-subunit Complex of Type III Secretion System in Aeromonas Hydrophila

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Y.; Yu, H.; Leung, K.; Sivaraman, J.; Mok, Y.

    2008-01-01

    In the type III secretion system (T3SS) of Aeromonas hydrophila, the putative needle complex subunit AscF requires both putative chaperones AscE and AscG for formation of a ternary complex to avoid premature assembly. Here we report the crystal structure of AscE at 2.7 Å resolution and the mapping of buried regions of AscE, AscG, and AscF in the AscEG and AscEFG complexes using limited protease digestion. The dimeric AscE comprises two helix-turn-helix monomers packed in an antiparallel fashion. The N-terminal 13 residues of AscE are buried only upon binding with AscG, but this region is found to be nonessential for the interaction. AscE functions as a monomer and can be coexpressed with AscG or with both AscG and AscF to form soluble complexes. The AscE binding region of AscG in the AscEG complex is identified to be within the N-terminal 61 residues of AscG. The exposed C-terminal substrate-binding region of AscG in the AscEG complex is induced to be buried only upon binding to AscF. However, the N-terminal 52 residues of AscF remain exposed even in the ternary AscEFG complex. On the other hand, the 35-residue C-terminal region of AscF in the complex is resistant to protease digestion in the AscEFG complex. Site-directed mutagenesis showed that two C-terminal hydrophobic residues, Ile83 and Leu84, of AscF are essential for chaperone binding.

  7. ASC FY17 Implementation Plan, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, P. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2017-06-14

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.

  8. Software Correlator for Radioastron Mission

    Science.gov (United States)

    Likhachev, Sergey F.; Kostenko, Vladimir I.; Girin, Igor A.; Andrianov, Andrey S.; Rudnitskiy, Alexey G.; Zharov, Vladimir E.

    In this paper, we discuss the characteristics and operation of the Astro Space Center (ASC) software FX correlator, an important component of the space-ground interferometer for the Radioastron project. This project performs joint observations of compact radio sources using a 10-m space radio telescope (SRT) together with ground radio telescopes at 92, 18, 6, and 1.3 cm wavelengths. We describe the main features of space-ground VLBI data processing for the Radioastron project using the ASC correlator. The quality of the implemented fringe-search procedure provides positive results without significant losses in correlated amplitude. The ASC correlator has computational power close to real-time operation. The correlator has a number of processing modes: “Continuum”, “Spectral Line”, “Pulsars”, “Giant Pulses”, “Coherent”. Special attention is paid to the peculiarities of Radioastron space-ground VLBI data processing. The algorithms for time delay and delay-rate calculation are also discussed, which are of fundamental importance for data correlation in space-ground interferometers. During five years of successful Radioastron SRT operation, the ASC correlator has shown high potential for satisfying the steadily growing needs of current and future ground and space VLBI science. Results of ASC software correlator operation are demonstrated.
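
    As a rough illustration of the FX approach (Fourier transform first, cross-multiplication second) and of delay compensation as a per-channel phase rotation, a toy two-station sketch follows; it is a simplified analogy, not the ASC correlator's algorithm, and the sample rate, FFT length, and test signal are arbitrary.

    ```python
    # Toy FX correlation of two station voltage streams (illustrative only).
    import numpy as np

    def fx_correlate(x, y, delay_s=0.0, n_fft=1024, sample_rate=32e6):
        """Return the time-averaged cross-spectrum of streams x and y,
        compensating station y for a geometric delay of delay_s seconds."""
        n_seg = min(len(x), len(y)) // n_fft
        freqs = np.fft.rfftfreq(n_fft, d=1.0 / sample_rate)
        phase = np.exp(2j * np.pi * freqs * delay_s)        # undo the delay on y
        acc = np.zeros(len(freqs), dtype=complex)
        for k in range(n_seg):
            X = np.fft.rfft(x[k * n_fft:(k + 1) * n_fft])   # "F" step
            Y = np.fft.rfft(y[k * n_fft:(k + 1) * n_fft]) * phase
            acc += X * np.conj(Y)                           # "X" step
        return acc / n_seg                                  # integrate

    rng = np.random.default_rng(0)
    common = rng.standard_normal(1 << 16)
    # Station y sees the same signal delayed by 3 samples.
    vis = fx_correlate(common, np.roll(common, 3), delay_s=3 / 32e6)
    print(abs(vis[:4]))
    ```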

  9. Report of experiments and evidence for ASC L2 milestone 4467: demonstration of a legacy application's path to exascale.

    Energy Technology Data Exchange (ETDEWEB)

    Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke; Leung, Vitus Joseph; Moreland, Kenneth D.; Lofstead, Gerald Fredrick, II; Gentile, Ann C. (Sandia National Laboratories, Livermore, CA); Klundt, Ruth Ann; Ward, H. Lee; Laros, James H., III; Hemmert, Karl Scott; Fabian, Nathan D.; Levenhagen, Michael J.; Barrett, Brian W.; Brightwell, Ronald Brian; Barrett, Richard; Wheeler, Kyle Bruce; Kelly, Suzanne Marie; Rodrigues, Arun F.; Brandt, James M. (Sandia National Laboratories, Livermore, CA); Thompson, David (Sandia National Laboratories, Livermore, CA); VanDyke, John P.; Oldfield, Ron A.; Tucker, Thomas (Open Grid Computing, Inc., Austin, TX); Vaughan, Courtenay Thomas

    2012-03-01

    This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467, Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Pathfinding the Flight Advanced Stirling Convertor Design with the ASC-E3

    Science.gov (United States)

    Wong, Wayne A.; Wilson, Kyle; Smith, Eddie; Collins, Josh

    2012-01-01

    The Advanced Stirling Convertor (ASC) was initially developed by Sunpower, Inc. under contract to NASA Glenn Research Center (GRC) as a technology development project. The ASC technology fulfills NASA's need for high-efficiency power convertors for future Radioisotope Power Systems (RPS). Early successful technology demonstrations between 2003 and 2005 eventually led to the expansion of the project, including the decision in 2006 to use the ASC technology on the Advanced Stirling Radioisotope Generator (ASRG). To date, Sunpower has delivered 22 ASC convertors of progressively mature designs to GRC. Currently, Sunpower, with support from GRC, Lockheed Martin Space Systems Company (LMSSC), and the Department of Energy (DOE), is developing the flight ASC-F in parallel with the ASC-E3 pathfinders. Sunpower will deliver four pairs of ASC-E3 convertors to GRC, which will be used for extended operation reliability assessment, independent validation and verification testing, system interaction tests, and to support LMSSC controller verification. The ASC-E3 and -F convertors are being built to the same design and processing documentation and the same product specification. The initial two pairs of ASC-E3 convertors are being built before the flight units and will validate design and processing changes prior to implementation on the ASC-F flight convertors. This paper summarizes the development of the ASC technology and the status of the ASC-E3 build, and explains how these convertors serve the vital pathfinder role ahead of the flight build for the ASRG. The ASRG is part of two of the three candidate missions being considered for selection for the Discovery 12 mission.

  12. 77 FR 25168 - Appraisal Subcommittee (ASC); ASC Rules of Operation; Amended

    Science.gov (United States)

    2012-04-27

    ... heads of the Bureau of Consumer Financial Protection and the Federal Housing Finance Agency. The ASC Rules of Operation serve as corporate bylaws outlining the ASC's purpose, functions, authority... Title XI.

  13. Performance Measurement of Advanced Stirling Convertors (ASC-E3)

    Science.gov (United States)

    Oriti, Salvatore M.

    2013-01-01

    NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing data of the Advanced Stirling Convertor (ASC). The latest version of the ASC (ASC-E3, to represent the third cycle of engineering model test hardware) is of a design identical to the forthcoming flight convertors. For this generation of hardware, a joint Sunpower and GRC effort was initiated to improve and standardize the test support hardware. After this effort was completed, the first pair of ASC-E3 units was produced by Sunpower and then delivered to GRC in December 2012. GRC has begun operation of these units. This process included performance verification, which examined the data from various tests to validate the convertor performance to the product specification. Other tests included detailed performance mapping that encompassed the wide range of operating conditions that will exist during a mission. These convertors were then transferred to Lockheed Martin for controller checkout testing. The results of this latest convertor performance verification activity are summarized here.

  14. 75 FR 80813 - Appraisal Subcommittee (ASC); ASC Rules of Operation; Amended

    Science.gov (United States)

    2010-12-23

    ... name and must rely on the General Services Administration (GSA) for the procurement of office space by... administration, procurement, and other services, consistent with directives of the ASC. In executing this... Conduct Section 11.01. Ethics Provision. The members of the ASC and its officers and employees shall...

  15. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  16. Advanced Stirling Convertor (ASC) Technology Maturation

    Science.gov (United States)

    Wong, Wayne A.; Wilson, Scott; Collins, Josh; Wilson, Kyle

    2016-01-01

    The Advanced Stirling Convertor (ASC) development effort was initiated by NASA Glenn Research Center with contractor Sunpower, Inc., to develop high-efficiency thermal-to-electric power conversion technology for NASA Radioisotope Power Systems (RPSs). Early successful performance demonstrations led to the expansion of the project as well as adoption of the technology by the Department of Energy (DOE) and system integration contractor Lockheed Martin Space Systems Company as part of the Advanced Stirling Radioisotope Generator (ASRG) flight project. The ASRG integrates a pair of ASCs to convert the heat from a pair of General Purpose Heat Source (GPHS) modules into electrical power. The expanded NASA ASC effort included development of several generations of ASC prototypes or engineering units to help prepare the ASC technology and Sunpower for flight implementation. Sunpower later had two parallel contracts allowing the last of the NASA engineering units called ASC-E3 to serve as pathfinders for the ASC-F flight convertors being built for DOE. The ASC-E3 convertors utilized the ASC-F flight specifications and were built using the ASC-F design and process documentation. Shortly after the first ASC-F pair achieved initial operation, due to budget constraints, the DOE ASRG flight development contract was terminated. NASA continues to invest in the development of Stirling RPS technology including continued production of the ASC-E3 convertors, seven of which have been delivered with one additional unit in production. Starting in fiscal year 2015, Stirling Convertor Technology Maturation has been reorganized as an element of the RPS Stirling Cycle Technology Development (SCTD) Project and long-term plans for continued Stirling technology advancement are in reformulation. This paper provides a status on the ASC project, an overview of advancements made in the design and production of the ASC at Sunpower, and a summary of acceptance tests, reliability tests, and tactical

  17. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig. 1 (Figure 1: new ATLAS Software & Computing organization). Two Management Boards will help the Computing Coordinator and the Software Project...

  18. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  19. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  20. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... operation of the software to display a restrictive rights legend or other license notice; and (2) Requires a... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and...

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ...) Restricted rights in computer software, limited rights in technical data, or government purpose license... necessary to perfect a license or licenses in the deliverable software or documentation of the appropriate... the license rights obtained. (e) Identification and delivery of computer software and computer...

  3. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Conformity, acceptance... Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...) Conformity and acceptance. Solicitations and contracts requiring the delivery of computer software shall...

  4. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  5. Advanced Stirling Convertor (ASC) Development for NASA RPS

    Science.gov (United States)

    Wong, Wayne A.; Wilson, Scott; Collins, Josh

    2014-01-01

    Development of Sunpower's Advanced Stirling Convertor (ASC) was initiated under contract to the NASA Glenn Research Center (GRC), and after a series of successful demonstrations, the ASC began transitioning from a technology development project to a flight development project. The ASC has very high power conversion efficiency, making it attractive for future Radioisotope Power Systems (RPS) in order to make best use of the low plutonium-238 fuel inventory in the U.S. In recent years, the ASC became part of the NASA-Department of Energy Advanced Stirling Radioisotope Generator (ASRG) Integrated Project. Sunpower held two parallel contracts to produce ASC convertors: one with the Department of Energy/Lockheed Martin to produce the ASC-F flight convertors, and one with NASA GRC for the production of ASC-E3 engineering units, the initial units of which served as production pathfinders. The integrated ASC technical team successfully overcame various technical challenges that led to the completion and delivery of the first two pairs of flight-like ASC-E3 convertors by 2013. However, in late fall 2013, the DOE initiated termination of the Lockheed Martin ASRG flight development contract, driven primarily by budget constraints. NASA continues to recognize the importance of high-efficiency ASC power conversion for RPS and continues investment in the technology, including the continuation of ASC-E3 production at Sunpower and the assembly of the ASRG Engineering Unit #2. This paper provides a summary of ASC technical accomplishments, an overview of tests at GRC, plans for continued ASC production at Sunpower, and the status of Stirling technology development.

  6. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine
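
    As a worked illustration of the shared likelihood mentioned above, the sketch below maximizes a constant-daily-survival-rate likelihood over nest visit intervals; the interval data are invented, and real analyses (as in the packages compared) add covariates and nest-age effects.

    ```python
    # Minimal interval-survival likelihood for nest data (illustrative only).
    import numpy as np
    from scipy.optimize import minimize_scalar

    # (exposure_days, survived_interval) pairs; values are hypothetical.
    intervals = [(3, True), (4, True), (5, False), (2, True), (7, False)]

    def neg_log_lik(logit_s):
        s = 1.0 / (1.0 + np.exp(-logit_s))          # daily survival rate
        ll = 0.0
        for t, survived in intervals:
            p = s ** t                               # survive all t days
            ll += np.log(p if survived else 1.0 - p)
        return -ll

    res = minimize_scalar(neg_log_lik, bounds=(-10.0, 10.0), method="bounded")
    print("estimated daily survival rate:", 1.0 / (1.0 + np.exp(-res.x)))
    ```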

  7. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  8. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  9. ASC deficiency suppresses proliferation and prevents medulloblastoma incidence.

    Science.gov (United States)

    Knight, E R W; Patel, E Y; Flowers, C A; Crowther, A J; Ting, J P; Miller, C R; Gershon, T R; Deshmukh, M

    2015-01-15

    Apoptosis-associated speck-like protein containing a caspase recruitment domain (ASC) is silenced by promoter methylation in many types of tumors, yet ASC's role in most cancers remains unknown. Here, we show that ASC is highly expressed in a model of medulloblastoma, the most common malignant pediatric brain cancer; ASC is also expressed in human medulloblastomas. Importantly, while ASC deficiency did not affect normal cerebellar development, ASC knockout mice on the Smoothened (ND2:SmoA1) transgenic model of medulloblastoma exhibited a profound reduction in medulloblastoma incidence and a delayed tumor onset. A similar decrease in tumorigenesis with ASC deficiency was also seen in the hGFAP-Cre:SmoM2 mouse model of medulloblastoma. Interestingly, hyperproliferation of the external granule layer (EGL) was comparable at P20 in both wild-type and ASC-deficient SmoA1 mice. However, while the apoptosis and differentiation markers remained unchanged at this age, proliferation markers were decreased, and the EGL was reduced in thickness and area by P60. This reduction in proliferation with ASC deficiency was also seen in isolated SmoA1 cerebellar granule precursor cells in vitro, indicating that the effect of ASC deletion on proliferation was cell autonomous. Interestingly, ASC-deficient SmoA1 cerebella exhibited disrupted expression of genes in the transforming growth factor-β pathway and an increased level of nuclear Smad3. Taken together, these results demonstrate an unexpected role for ASC in Sonic hedgehog-driven medulloblastoma tumorigenesis, thus identifying ASC as a promising novel target for antitumor therapy.

  10. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  11. Authenticated Secure Container System (ASCS)

    International Nuclear Information System (INIS)

    1991-01-01

    Sandia National Laboratories developed an Authenticated Secure Container System (ASCS) for the International Atomic Energy Agency (IAEA). Agency standard weights and safeguards samples can be stored in the ASCS to provide continuity of knowledge. The ASCS consists of an optically clear cover, a base containing the Authenticated Item Monitoring System (AIMS) transmitter, and the AIMS receiver unit for data collection. The ASCS will provide the Inspector with information concerning the status of the system, during a surveillance period, such as state of health, tampering attempts, and movement of the container system. The secure container is located inside a Glove Box with the receiver located remotely from the Glove Box. AIMS technology uses rf transmission from the secure container to the receiver to provide a record of state of health and tampering. The data is stored in the receiver for analysis by the Inspector during a future inspection visit. 2 refs

  12. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)]

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  13. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  14. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students’ music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures to them. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students’ initiative in music learning but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  15. Ambulatory Surgical Center (ASC) Payment System

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file contains a summary of service utilization by ASC supplier and is derived from 2011 ASC line item level data, updated through June 2012, that is, line items...

  16. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to put out graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  17. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  18. The development of the brake system of the BMW 850i including ABS and ASC. Entwicklung des Bremssystems des BMW 850i einschliesslich ABS und ASC

    Energy Technology Data Exchange (ETDEWEB)

    Kraft, H.J.; Leffler, H.

    1990-02-01

    The brake system of the new BMW 850i is described in the following. Brake actuation takes place via a hydraulic brake booster. The disc brakes at the front and rear axles are arranged in a diagonal brake split. The 4-channel ABS is fitted as standard equipment. The ABS control unit also incorporates the algorithm for the Automatic Stability Control system ASC or ASC+T. The ASC+T shows improved traction compared with the pure stability system ASC and is standard in the BMW 850i with a manual gearbox. BMW 850i models equipped with an automatic gearbox are supplied with ASC; the ASC+T is available as an option. Both systems, the ASC and the ASC+T, are described with particular attention to the electronic and hydraulic networks in the car. A performance comparison of the ASC systems completes the description. (orig.).

  19. 14 CFR 415.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  20. Advanced Stirling Convertor (ASC) Technology Maturation in Preparation for Flight

    Science.gov (United States)

    Wong, Wayne A.; Cornell, Peggy A.

    2012-01-01

    The Advanced Stirling Convertor (ASC) is being developed by an integrated team of Sunpower and the National Aeronautics and Space Administration's (NASA's) Glenn Research Center (GRC). The ASC development, funded by NASA's Science Mission Directorate, started as a technology development effort in 2003 and has since evolved through progressive convertor builds and successful testing to demonstrate high conversion efficiency, low mass, and capability to meet long-life Radioisotope Power System (RPS) requirements. The technology has been adopted by the Department of Energy and Lockheed Martin Space Systems Company's Advanced Stirling Radioisotope Generator (ASRG), which has been selected for potential flight demonstration on Discovery 12. This paper provides an overview of the status of ASC development including the most recent ASC-E2 convertors that have been delivered to GRC and an introduction to the ASC-E3 and ASC flight convertors that Sunpower will build next. The paper also describes the technology maturation and support tasks being conducted at GRC to support ASC and ASRG development in the areas of convertor and generator extended operation, high-temperature materials, heater head life assessment, organics, nondestructive inspection, spring fatigue testing, and other reliability verification tasks.

  1. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teach...

  2. Software and Systems Test Track Architecture and Concept Definition

    Science.gov (United States)

    2007-05-01

    [Fragment of a software inventory table: it lists packages such as Flex (Free Software Foundation), Fluent (Fluent Inc.), FMD, Fortran 77/90 compilers (Compaq/Cray/SGI), FTA (Platform), and GAMESS, together with the versions deployed at the ASC and ERDC centers.]

  3. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer based instrumentation and control functions in nuclear facilities. The utilization of computer based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants," and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations," for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.

  4. Advanced Stirling Convertor (ASC-E2) Characterization Testing

    Science.gov (United States)

    Williams, Zachary D.; Oriti, Salvatore M.

    2012-01-01

    Testing has been conducted on Advanced Stirling Convertor (ASC)-E2 units at NASA Glenn Research Center in support of the Advanced Stirling Radioisotope Generator (ASRG) project. This testing has been conducted to understand sensitivities of convertor parameters due to environmental and operational changes during operation of the ASRG in missions to space. This paper summarizes test results and explains the operation of the ASRG during space missions.

  5. Software For Computing Selected Functions

    Science.gov (United States)

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  6. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  7. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  8. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
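
    The general pattern the tool follows (scan each CSCI subdirectory on the CD and emit one transfer script per CSCI) can be sketched as below; the directory layout, target paths, and generated commands are hypothetical placeholders, not the actual CFITSG conventions.

    ```python
    # Hypothetical sketch of a per-CSCI transfer-script generator.
    # Paths and commands are placeholders, not the real CFITSG output.
    from pathlib import Path

    CD_ROOT = Path("/cdrom/flight_software")   # hypothetical CD mount point
    PCS_SCRATCH = "/pcs/scratch"               # hypothetical PCS scratch dir

    def generate_transfer_scripts(cd_root: Path, out_dir: Path) -> None:
        out_dir.mkdir(parents=True, exist_ok=True)
        for csci_dir in sorted(p for p in cd_root.iterdir() if p.is_dir()):
            lines = ["#!/bin/ksh",
                     f"# Transfer script for CSCI '{csci_dir.name}'",
                     f"mkdir -p {PCS_SCRATCH}/{csci_dir.name}"]
            for f in sorted(csci_dir.rglob("*")):
                if f.is_file():
                    lines.append(f"cp {f} {PCS_SCRATCH}/{csci_dir.name}/{f.name}")
            script = out_dir / f"transfer_{csci_dir.name}.ksh"
            script.write_text("\n".join(lines) + "\n")
            script.chmod(0o755)

    if __name__ == "__main__":
        if CD_ROOT.is_dir():
            generate_transfer_scripts(CD_ROOT, Path("./generated_scripts"))
    ```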

  9. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Toward the development of a new, complete, and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style, and the actor model of computation. As a result, a new resources-based framework arises which, after initial cases of use, appears to be useful and worthy of further research.
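
    As a minimal illustration of the actor-model ingredient of the proposed synthesis (the Domain-Driven Design and REST ingredients are omitted), the sketch below shows an actor that owns its state and handles mailbox messages one at a time; the PUT/GET message protocol is invented for the example.

    ```python
    # Toy actor: private state, messages processed sequentially from a mailbox.
    import queue
    import threading

    class ResourceActor:
        def __init__(self):
            self._mailbox = queue.Queue()
            self._state = {}
            threading.Thread(target=self._run).start()

        def send(self, message):
            self._mailbox.put(message)            # asynchronous, non-blocking

        def _run(self):
            while True:
                msg = self._mailbox.get()
                if msg is None:                   # poison pill stops the actor
                    return
                verb, key, value = msg            # invented PUT/GET protocol
                if verb == "PUT":
                    self._state[key] = value
                elif verb == "GET":
                    print(f"{key} = {self._state.get(key)}")

    stock = ResourceActor()
    stock.send(("PUT", "warehouse-1", 42))
    stock.send(("GET", "warehouse-1", None))
    stock.send(None)                              # shut the actor down
    ```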

  10. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering, but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong, because a system is more than software. It is composed of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  11. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  12. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  13. 48 CFR 212.7003 - Technical data and computer software.

    Science.gov (United States)

    2010-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  14. ASC-PROBA Interface Control Document

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Jørgensen, Finn E

    1999-01-01

    of Automation of the Technical University of Denmark. The document is structured as follows. First we present the ASC - heritage, system description, performance - then we address more specifically the environmental properties, like the EMC compatibility and thermal characteristics, and the design...... and reliability issues. Section 6 deals with the testing and the calibration procedures and in section 7 the mechanical and electrical interfaces are given. In section 8 and 9 we address issues like manufacturing, transportation and storage and to conclude we review the ASC specifications against the PROBA...

  15. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  16. Software engineering frameworks for the cloud computing paradigm

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents the latest research on Software Engineering Frameworks for the Cloud Computing Paradigm, drawn from an international selection of researchers and practitioners. The book offers both a discussion of relevant software engineering approaches and practical guidance on enterprise-wide software deployment in the cloud environment, together with real-world case studies. Features: presents the state of the art in software engineering approaches for developing cloud-suitable applications; discusses the impact of the cloud computing paradigm on software engineering; offers guidance an

  17. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    Energy Technology Data Exchange (ETDEWEB)

    Hornung, Rich [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jones, Holger [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Keasler, Jeff [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Neely, Rob [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pearce, Olga [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hammond, Si [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trott, Christian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Paul [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vaughan, Courtenay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cook, Jeanine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoekstra, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bergen, Ben [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Payne, Josh [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Womeldorff, Geoff [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-23

    In 2015, the three Department of Energy (DOE) National Laboratories that make up the Advanced Simulation and Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance-portability programming environments in the context of several ASC co-design proxy applications, as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those combinations. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, and b) comparing runtime performance between

  18. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  19. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  20. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  1. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software set up for every one-hour counting period, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC), consisting of hardware and software, was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
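
    The control flow described above (load a sample, start a GammaVision measurement, wait out the one-hour count, repeat for up to 30 samples) can be pictured with the minimal sketch below. All function names and timings are hypothetical placeholders; the real control software is implemented in LabVIEW and drives the actual hardware and GammaVision.

    ```python
    # Hypothetical sketch of an automatic-sample-changer control loop for NAA
    # counting. Function names and constants are placeholders; the system in the
    # record is built in LabVIEW and calls GammaVision for spectrum acquisition.
    import time

    COUNT_TIME_S = 3600            # one-hour counting time per sample
    NUM_SAMPLES = 30               # the ASC handles up to 30 samples in a row

    def move_sample_to_detector(position: int) -> None:
        """Placeholder for the hardware command that rotates the sample changer."""
        print(f"moving sample {position} onto the detector")

    def start_gammavision_measurement(position: int) -> None:
        """Placeholder for invoking GammaVision to acquire and save the spectrum."""
        print(f"starting acquisition for sample {position}")

    for position in range(1, NUM_SAMPLES + 1):
        move_sample_to_detector(position)
        start_gammavision_measurement(position)
        time.sleep(COUNT_TIME_S)   # wait out the one-hour acquisition
    ```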

  2. High Performance Computing Operations Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  3. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of a comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  4. 48 CFR 52.227-19 - Commercial Computer Software License.

    Science.gov (United States)

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  5. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  6. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  7. Test Hardware Design for Flightlike Operation of Advanced Stirling Convertors (ASC-E3)

    Science.gov (United States)

    Oriti, Salvatore M.

    2012-01-01

    NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing of the Advanced Stirling Convertor (ASC). For this purpose, the Thermal Energy Conversion branch at GRC has been conducting extended operation of a multitude of free-piston Stirling convertors. The goal of this effort is to generate long-term performance data (tens of thousands of hours) simultaneously on multiple units to build a life and reliability database. The test hardware for operation of these convertors was designed to permit in-air investigative testing, such as performance mapping over a range of environmental conditions. With this, there was no requirement to accurately emulate the flight hardware. For the upcoming ASC-E3 units, the decision has been made to assemble the convertors into a flight-like configuration. This means the convertors will be arranged in the dual-opposed configuration in a housing that represents the fit, form, and thermal function of the ASRG. The goal of this effort is to enable system level tests that could not be performed with the traditional test hardware at GRC. This offers the opportunity to perform these system-level tests much earlier in the ASRG flight development, as they would normally not be performed until fabrication of the qualification unit. This paper discusses the requirements, process, and results of this flight-like hardware design activity.

  8. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  9. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  10. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been given to the human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  11. Atypical squamous cells, cannot exclude high grade squamous intraepithelial lesion (ASC-H) in HIV-positive women

    Directory of Open Access Journals (Sweden)

    Michelow Pam

    2010-01-01

    Full Text Available Objective: South Africa has very high rates of both HIV infection and cervical pathology. The management of ASC-H is colposcopy and directed biopsy, but with so many women diagnosed with HSIL and a dearth of colposcopy centres in South Africa, women with a cytologic diagnosis of ASC-H may not be prioritized for colposcopy. The aim of this study was to determine if HIV-positive women with a cytologic diagnosis of ASC-H should undergo immediate colposcopy or whether colposcopy can be delayed, within the context of an underfunded health care setting with so many competing health needs. Materials and Methods: A computer database search was performed from the archives of an NGO-administered clinic that offers comprehensive HIV care. All women with a cytologic diagnosis of ASC-H on cervical smears from September 2005 until August 2009 were identified. Histologic follow up was sought in all patients. Results: A total of 2111 cervical smears were performed and 41 diagnosed as ASC-H (1.94%). No histologic follow up data was available in 15 cases. Follow up histologic results were as follows: three negative (11.5%), five koilocytosis and/or CIN1 (19.2%), ten CIN2 (38.5%) and eight CIN3 (30.8%). There were no cases of invasive carcinoma on follow up. Conclusion: The current appropriate management of HIV-positive women in low-resource settings with a diagnosis of ASC-H on cervical smear is colposcopy, despite the costs involved. In the future and if cost-effective in developing nations, use of novel markers may help select which HIV-positive women can be managed conservatively and which ones referred for more active treatment. More research in this regard is warranted.

  12. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960's, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
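
    As one concrete illustration of the kind of practice recommended above (our example, not taken from the paper), a research code can adopt automated unit tests so that functional errors are caught as the code grows; the function and test below are hypothetical.

    ```python
    # Minimal illustration of a software-engineering practice (unit testing)
    # applied to research code. The function and test are illustrative only.
    def decay_fraction(half_life: float, elapsed: float) -> float:
        """Fraction of a radioactive sample remaining after `elapsed` time."""
        if half_life <= 0:
            raise ValueError("half_life must be positive")
        return 0.5 ** (elapsed / half_life)

    def test_decay_fraction():
        # Run with: pytest this_file.py
        assert decay_fraction(10.0, 0.0) == 1.0
        assert abs(decay_fraction(10.0, 10.0) - 0.5) < 1e-12
    ```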

  13. Teaching cloud computing: a software engineering perspective

    OpenAIRE

    Sommerville, Ian

    2012-01-01

    This short paper discusses the issues of teaching cloud computing from a software engineering rather than a business perspective. It discusses what topics might be covered in a senior course on cloud software engineering.

  14. Immunocytoexpression profile of ProExC in smears interpreted as ASC-US, ASC-H, and cervical intraepithelial lesion

    Directory of Open Access Journals (Sweden)

    Zeynep Tosuner

    2017-01-01

    Full Text Available Aims: We aimed to investigate the immunocytoexpression profiles of a novel assay, ProEx C, for topoisomerase II alpha (TOP2A) and minichromosome maintenance protein 2 (MCM2) in abnormally interpreted smears. Settings and Design: Screening programs with the Papanicolaou smear and high-risk-group human papilloma virus testing have yielded a dramatic reduction of cervical cancer incidence. However, both of these tests have limited specificity for the detection of clinically significant cervical high grade lesions. ProEx C for topoisomerase II alpha (TOP2A) and minichromosome maintenance protein 2 (MCM2) has been considered to have a tight association with high grade intraepithelial lesions. Materials and Methods: A total number of 54 SurePath cervical cytology specimens of patients previously interpreted as atypical squamous cells-undetermined significance (ASC-US), atypical squamous cells-cannot exclude high grade squamous intraepithelial lesion (ASC-H), low grade squamous intraepithelial lesion (LSIL), and high grade squamous intraepithelial lesion (HSIL) were included in our study. Results and Conclusions: ProEx C was positive in 14 of 14 HSILs (100%), 3 of 19 LSILs (16%), 2 of 4 ASC-Hs, and none of the ASC-USs (0%). The ProEx C test showed very intense nuclear staining in all cytologically abnormal cells. Further studies are indicated to evaluate the diagnostic role of ProEx C.

  15. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  16. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
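
    A minimal sketch of the "appropriate metadata" idea, assuming nothing about the author's actual tooling: recording the technology stack alongside an archived code so that the environment can later be rebuilt in a virtual machine or container. The schema, file name, and package details are illustrative assumptions.

    ```python
    # Minimal sketch of capturing technology-stack metadata for an archived
    # software system; the schema and values are assumptions for illustration.
    import json
    import platform
    import sys

    stack_metadata = {
        "software": "example-analysis-tool",      # hypothetical package name
        "version": "1.4.2",                       # hypothetical version
        "language": f"Python {platform.python_version()}",
        "operating_system": platform.platform(),
        "architecture": platform.machine(),
        "interpreter": sys.executable,
        "dependencies": ["numpy", "scipy"],       # recorded from the build environment
    }

    with open("archive_metadata.json", "w") as fh:
        json.dump(stack_metadata, fh, indent=2)
    ```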

  17. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talks and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  19. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software; these reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell system.

  20. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  1. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  2. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, a comparison of three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patients Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), was made in terms of their calculation algorithms and the resulting calculated doses. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines, using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed some differences in the final dose reports provided by these software packages, with deviations among the effective dose estimates: the percentage coefficient of variation ranges from 3.3 to 23.4% for SSCT and from 10.6 to 43.8% for MSCT. It is important that researchers state the name of the software that is used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
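
    For reference, the spread quoted above is the percentage coefficient of variation across the packages' effective-dose estimates, i.e. the standard deviation divided by the mean, times 100. The sketch below computes it for illustrative placeholder values, not the study's data.

    ```python
    # Percentage coefficient of variation across dose estimates from several
    # software packages. The three values are placeholders, not study results.
    import statistics

    effective_doses_mSv = [1.8, 2.0, 2.3]   # hypothetical CT-Expo, ImPACT, WinDose outputs

    cv_percent = (100 * statistics.stdev(effective_doses_mSv)
                  / statistics.mean(effective_doses_mSv))
    print(f"coefficient of variation: {cv_percent:.1f} %")
    ```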

  3. Seeding the cloud: Financial bootstrapping in the computer software sector

    OpenAIRE

    Mac An Bhaird, Ciarán; Lynn, Theo

    2015-01-01

    This study investigates resourcing of computer software companies that have adopted cloud computing for the development and delivery of application software. Use of this innovative technology potentially impacts firm financing because the initial infrastructure investment requirement is much lower than for packaged software, lead time to market is shorter, and cloud computing supports instant scalability. We test these predictions by conducting in-depth interviews with founders of 18 independ...

  4. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  5. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  6. TMS communications software. Volume 1: Computer interfaces

    Science.gov (United States)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  7. 14 CFR 417.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  8. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  9. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory-level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  10. Revision of ASCE 4

    International Nuclear Information System (INIS)

    Nelson, T.A.; Murray, R.C.; Short, S.A.

    1995-01-01

    The original version of ASCE Standard 4, "Seismic Analysis of Safety-Related Nuclear Structures", was published in September 1986. It is ASCE policy to update its standards at five-year intervals, and the Working Group on Seismic Analysis of Safety Related Nuclear Structures was reconvened to formulate the revisions. The goal in updating the standard is to ensure that it remains relevant and that it incorporates the state of the practice in seismic engineering; in cases where it has been demonstrated that improvements to standard practice are needed, state-of-the-art improvements are included. The contents of the new standard cover the same areas as the original version, with some additions. The contents are as follows: input - response spectra and time histories; modeling of structures; analysis of structures; soil-structure interaction; input for subsystem analysis; special structures - buried pipes and conduits, earth-retaining walls, above-ground vertical tanks, raceways, and base-isolated structures; and an appendix providing seismic probabilistic risk assessment and margin assessment

  11. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Phillips, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wampler, Cheryl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meisner, Robert [National Nuclear Security Administration (NNSA), Washington, DC (United States)

    2010-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality, and scientific details); to quantify critical margins and uncertainties; and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  12. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  13. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations, especially estimating and tender analysis. The objectives of this research are to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  14. 48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...

  15. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer-based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated-tools environment for the whole software life cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trend of 'Software Factories'. (author)

  16. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  17. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  18. CMS software and computing for LHC Run 2

    CERN Document Server

    INSPIRE-00067576

    2016-11-09

    The CMS offline software and computing system has successfully met the challenge of LHC Run 2. In this presentation, we will discuss how the entire system was improved in anticipation of an increased trigger output rate, an increased rate of pileup interactions and the evolution of computing technology. The primary goals behind these changes were to increase the flexibility of computing facilities wherever possible, to increase our operational efficiency, and to decrease the computing resources needed to accomplish the primary offline computing workflows. These changes have resulted in a new approach to distributed computing in CMS for Run 2 and for the future, as the LHC luminosity should continue to increase. We will discuss changes and plans to our data federation, which was one of the key changes towards a more flexible computing model for Run 2. Our software framework and algorithms also underwent significant changes. We will summarize our experience with a new multi-threaded framework as deployed on ou...

  19. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Science.gov (United States)

    2010-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  20. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    .../contractor proposes its standard commercial software license, those applicable portions thereof consistent... its standard commercial software license until after this purchase order/contract has been issued, or at or after the time the computer software is delivered, such license shall nevertheless be deemed...

  1. Differential splicing of the apoptosis-associated speck-like protein containing a caspase recruitment domain (ASC) regulates inflammasomes

    Directory of Open Access Journals (Sweden)

    Rojanasakul Yon

    2010-05-01

    Full Text Available Abstract Background The apoptosis-associated speck-like protein containing a caspase recruitment domain (ASC) is the essential adaptor protein for caspase 1-mediated interleukin (IL)-1β and IL-18 processing in inflammasomes. It bridges activated Nod-like receptors (NLRs), which are a family of cytosolic pattern recognition receptors of the innate immune system, with caspase 1, resulting in caspase 1 activation and subsequent processing of caspase 1 substrates. Hence, macrophages from ASC-deficient mice are impaired in their ability to produce bioactive IL-1β. Furthermore, we recently showed that ASC translocates from the nucleus to the cytosol in response to inflammatory stimulation in order to promote an inflammasome response, which triggers IL-1β processing and secretion. However, the precise regulation of inflammasomes at the level of ASC is still not completely understood. In this study we identified and characterized three novel ASC isoforms for their ability to function as an inflammasome adaptor. Methods To establish the ability of ASC and ASC isoforms as functional inflammasome adaptors, IL-1β processing and secretion was investigated by ELISA in inflammasome reconstitution assays, stable expression in THP-1 and J774A1 cells, and by restoring the lack of endogenous ASC in mouse RAW264.7 macrophages. In addition, the localization of ASC and ASC isoforms was determined by immunofluorescence staining. Results The three novel ASC isoforms, ASC-b, ASC-c and ASC-d, display unique and distinct capabilities to each other and to full-length ASC in respect to their function as an inflammasome adaptor, with one of the isoforms even showing an inhibitory effect. Consistently, only the activating isoforms of ASC, ASC and ASC-b, co-localized with NLRP3 and caspase 1, while the inhibitory isoform ASC-c co-localized only with caspase 1, but not with NLRP3. ASC-d did not co-localize with NLRP3 or with caspase 1 and consistently lacked the ability to function as an

  2. Network protocol changes can improve DisCom WAN performance : evaluating TCP modifications and SCTP in the ASC tri-lab environment.

    Energy Technology Data Exchange (ETDEWEB)

    Tolendino, Lawrence F.; Hu, Tan Chang

    2005-06-01

    The Advanced Simulation and Computing (ASC) Distance Computing (DisCom) Wide Area Network (WAN) is a high performance, long distance network environment that is based on the ubiquitous TCP/IP protocol set. However, the Transmission Control Protocol (TCP) and the algorithms that govern its operation were defined almost two decades ago for a network environment vastly different from the DisCom WAN. In this paper we explore and evaluate possible modifications to TCP that purport to improve TCP performance in environments like the DisCom WAN. We also examine a much newer protocol, SCTP (Stream Control Transmission Protocol) that claims to provide reliable network transport while also implementing multi-streaming, multi-homing capabilities that are appealing in the DisCom high performance network environment. We provide performance comparisons and recommendations for continued development that will lead to network communications protocol implementations capable of supporting the coming ASC Petaflop computing environments.
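
    As a rough illustration of the protocol choices discussed above (not the paper's test harness), the sketch below shows how an application can enlarge TCP socket buffers for a long-distance, high bandwidth-delay path and, on operating systems built with SCTP support, open an SCTP stream socket instead of a TCP one. Buffer sizes are illustrative assumptions.

    ```python
    # Illustrative sketch only; not the DisCom test code. Shows two of the knobs
    # discussed: larger TCP socket buffers for high bandwidth-delay paths, and an
    # SCTP socket as an alternative transport (requires OS SCTP support).
    import socket

    # TCP with enlarged send/receive buffers (values are illustrative).
    tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp_sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 4 * 1024 * 1024)
    tcp_sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 4 * 1024 * 1024)

    # SCTP one-to-one style socket, where the kernel exposes IPPROTO_SCTP.
    if hasattr(socket, "IPPROTO_SCTP"):
        sctp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM,
                                  socket.IPPROTO_SCTP)
        print("SCTP socket created")
    else:
        print("SCTP not available on this platform")
    ```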

  3. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  4. A briefing to verification and validation of computer software

    International Nuclear Information System (INIS)

    Zhang Aisen; Xie Yalian

    2012-01-01

    Nowadays, computer equipment and information processing technology are entering the engineering of instrumentation and process control. Owing to their convenience and other advantages, more and more utilities are happy to use them. After initial use in basic functional control, computer equipment and information processing technology are now widely used in safety-critical control. Consequently, people pay more attention to the quality of computer software, and how to assess and ensure that quality has become a central concern. Verification and validation of computer software are important steps in its quality assurance. (authors)

  5. Computers and Young Children. Storyboard Software: Flannel Boards in the Computer Age.

    Science.gov (United States)

    Shade, Daniel D.

    1995-01-01

    Describes storyboard software as computer programs with which children can build a story using visuals. Notes the importance of such programs for preliterate or nonreading children. Describes a new storyboard program, "Wiggins in Storyland," and its features. Lists recommended storyboard software programs, with publishers and compatible…

  6. Ascospore dimorphism in Glomerella cingulata f. sp. phaseoli

    OpenAIRE

    Castro, Renata A.; Mendes-Costa, Maria C.; Souza, Elaine A.

    2006-01-01

    The objective of this work was to characterize strains of Glomerella cingulata f. sp. phaseoli with respect to mycelial growth, perithecium formation and ascospore dimorphism. Four strains were evaluated on bean-leaf culture medium and M3 medium, under conditions with and without a photoperiod. The ascospore length of the normal and the mutant strains was determined. There were significant differences in mycelial growth and ascospore size. It was observed, for the first time,...

  7. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  8. USDA-ASCS 1936-1939 Air Photos

    Data.gov (United States)

    Minnesota Department of Natural Resources — This data set is a digital version of aerial photographs taken during the 1936-1939 time frame for the USDA-ASCS. These photos were originally recorded at a scale of...

  9. The AscSimulationMode command

    DEFF Research Database (Denmark)

    Jørgensen, John Leif

    1998-01-01

    Complex instruments like the ASC may be quite difficult to test in closed loops. This problem is compounded by the fact that no direct stimulation of the CHU is possible that will render the full performance, noise spectrum and real-time behaviour with high fidelity. In order to circumvent this impasse...

  10. Computer software to assess weld thickness loss in offshore pipelines: PEDS

    Energy Technology Data Exchange (ETDEWEB)

    Germano, Andre Luiz Silva; Correa, Samanda Cristine Arruda [Centro Universitario Estadual da Zona Oeste (CCMAT/UEZO), Rio de Janeiro, RJ (Brazil)], e-mail: scorrea@nuclear.ufrj.br; Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo Tadeu [Programa de Engenharia Nuclear, COPPE, Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil)], e-mails: emonteiro@nuclear.ufrj.br, ademir@nuclear.ufrj.br, ricardo@lin.ufrj.br

    2010-07-01

    The purpose of this work is to present an initial overview of a software package named PEDS for assessing weld thickness loss in offshore pipelines through digital radiography. The software calculates the thickness loss using a data bank obtained from computational modeling based on the Monte Carlo MCNPX code. To give users more flexibility, the software was written in Java, which allows it to run on Linux, Mac OS X and Windows. Furthermore, tools are provided to display images, select and analyze specific areas of an image (measure the average and the area of a selection) and generate profile plots. Applications of this software in the offshore area are presented. (author)
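
    A minimal sketch, under assumed numbers, of the data-bank idea described above: the measured radiographic signal for a weld region is converted to a thickness-loss estimate by interpolating in a table generated beforehand with Monte Carlo simulations. The calibration values and function below are illustrative, not PEDS code.

    ```python
    # Illustrative only: interpolate weld thickness loss from a simulated
    # calibration table. The table values are placeholders, not the MCNPX data
    # bank used by PEDS.
    import numpy as np

    # Hypothetical calibration: normalized gray value vs. thickness loss (mm),
    # generated offline with Monte Carlo (MCNPX) modelling in the real system.
    gray_values = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
    thickness_loss_mm = np.array([0.0, 1.0, 2.5, 4.0, 6.0])

    def estimate_thickness_loss(measured_gray: float) -> float:
        """Linear interpolation in the simulated calibration table."""
        return float(np.interp(measured_gray, gray_values, thickness_loss_mm))

    print(estimate_thickness_loss(0.42))   # -> a value between 1.0 and 2.5 mm
    ```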

  11. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.
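
    As a generic illustration of the component idea described above, and not the actual CCA or Babel API, the Python sketch below shows one component that provides an interface ("port"), another that uses it, and the trivial framework wiring that connects the two. All names are invented for the example.

      # Generic ports sketch; class and method names are illustrative only.
      from abc import ABC, abstractmethod

      class IntegratorPort(ABC):
          @abstractmethod
          def integrate(self, f, a: float, b: float) -> float: ...

      class MidpointIntegrator(IntegratorPort):
          """Component that *provides* the IntegratorPort interface."""
          def integrate(self, f, a, b, n=1000):
              h = (b - a) / n
              return h * sum(f(a + (i + 0.5) * h) for i in range(n))

      class EnergyDriver:
          """Component that *uses* an IntegratorPort supplied by the framework."""
          def __init__(self, integrator: IntegratorPort):
              self.integrator = integrator

          def run(self):
              return self.integrator.integrate(lambda x: x * x, 0.0, 1.0)

      # "Framework" wiring: connect the provided port to the using component.
      print(EnergyDriver(MidpointIntegrator()).run())  # ~0.3333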

  12. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  13. Quality assurance of nuclear medicine computer software

    International Nuclear Information System (INIS)

    Cradduck, T.D.

    1986-01-01

    Although quality assurance activities have become well established for the hardware found in nuclear medicine, little attention has been paid to computer software. This paper outlines some of the problems that exist and indicates some of the solutions presently under development. The major thrust has been towards the establishment of programming standards and comprehensive documentation. Some manufacturers have developed installation verification procedures which programmers are urged to use as models for their own programs. Items that tend to cause erroneous results are discussed, with the emphasis for error detection and correction placed on proper education and training of the computer operator. The concept of interchangeable data files, or 'software phantoms', for purposes of quality assurance is discussed. (Author)

  14. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  15. Absence of the inflammasome adaptor ASC reduces hypoxia-induced pulmonary hypertension in mice.

    Science.gov (United States)

    Cero, Fadila Telarevic; Hillestad, Vigdis; Sjaastad, Ivar; Yndestad, Arne; Aukrust, Pål; Ranheim, Trine; Lunde, Ida Gjervold; Olsen, Maria Belland; Lien, Egil; Zhang, Lili; Haugstad, Solveig Bjærum; Løberg, Else Marit; Christensen, Geir; Larsen, Karl-Otto; Skjønsberg, Ole Henning

    2015-08-15

    Pulmonary hypertension is a serious condition that can lead to premature death. The mechanisms involved are incompletely understood, although a role for the immune system has been suggested. Inflammasomes are part of the innate immune system and consist of the effector caspase-1 and a receptor, where nucleotide-binding oligomerization domain-like receptor pyrin domain-containing 3 (NLRP3) is the best characterized and interacts with the adaptor protein apoptosis-associated speck-like protein containing a caspase-recruitment domain (ASC). To investigate whether ASC and NLRP3 inflammasome components are involved in hypoxia-induced pulmonary hypertension, we utilized mice deficient in ASC and NLRP3. Active caspase-1, IL-18, and IL-1β, which are regulated by inflammasomes, were measured in lung homogenates in wild-type (WT), ASC(-/-), and NLRP3(-/-) mice, and phenotypical changes related to pulmonary hypertension and right ventricular remodeling were characterized after hypoxic exposure. Right ventricular systolic pressure (RVSP) of ASC(-/-) mice was significantly lower than in WT mice exposed to hypoxia (40.8 ± 1.5 mmHg vs. 55.8 ± 2.4 mmHg), as was right ventricular remodeling. RVSP of NLRP3(-/-) mice exposed to hypoxia was not significantly altered compared with WT hypoxia. Whereas hypoxia increased protein levels of caspase-1, IL-18, and IL-1β in WT and NLRP3(-/-) mice, this response was absent in ASC(-/-) mice. Moreover, ASC(-/-) mice displayed reduced muscularization and collagen deposition around arteries. In conclusion, hypoxia-induced elevated right ventricular pressure and remodeling were attenuated in mice lacking the inflammasome adaptor protein ASC, suggesting that inflammasomes play an important role in the pathogenesis of pulmonary hypertension. Copyright © 2015 the American Physiological Society.

  16. Automating the management of software projects in a developing it ...

    African Journals Online (AJOL)

    Software project management is the control of the transformation of users' ... Model from The American Systems Corporation (ASC) was used for risk management. ... Multi-site development approach facilitates large projects by using simple ...

  17. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear model that can be used to predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined.
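
    As a minimal sketch of the interval-arithmetic idea mentioned in the abstract, and not the authors' implementation, the Python fragment below propagates lower and upper bounds through a calculation so that the final interval width bounds the numerical uncertainty. A production library would also control the rounding mode of each floating-point operation, which plain Python floats do not expose.

      # Toy interval arithmetic: the result's width bounds the uncertainty.
      from dataclasses import dataclass

      @dataclass
      class Interval:
          lo: float
          hi: float

          def __add__(self, other):
              return Interval(self.lo + other.lo, self.hi + other.hi)

          def __mul__(self, other):
              p = (self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi)
              return Interval(min(p), max(p))

          @property
          def width(self):
              return self.hi - self.lo

      x = Interval(0.333333, 0.333334)   # an uncertain intermediate quantity
      y = Interval(2.999999, 3.000001)
      z = x * y + Interval(1.0, 1.0)
      print(z, "width =", z.width)       # the width bounds the numerical error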

  18. 48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...

  19. Computer software summaries. Numbers 1 through 423

    International Nuclear Information System (INIS)

    1979-09-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the US Department of Energy and the Nuclear Regulatory Commission. A major activity of the Center is the preparation and publication of two reports issued periodically - the Center's compilation of program abstracts, ANL-7411, and this software summaries report, ANL-8040. The abstracts describe the software packages available in the software exchange library maintained and distributed by the Center. The summaries describe agency-sponsored software that is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. Summaries describe software that is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. The purpose of the summaries report is to keep agency and contractor personnel informed as to the existence, status, and availability of computer programs within the agency, and thereby minimize duplication costs and maximize the value of agency software development efforts.

  20. Computer software summaries. Numbers 325 through 423

    Energy Technology Data Exchange (ETDEWEB)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data. (RWR)

  1. Computer software summaries. Numbers 325 through 423

    International Nuclear Information System (INIS)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data

  2. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  3. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2012-08-22

    ... review of applications for permits and licenses. The DG entitled ``Developing Software Life Cycle... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission...

  4. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    Technological obsolescence is an on-going challenge for all computer use. By design, and to some extent good fortune, AECL has had a good track record with respect to the march of obsolescence in CANDU digital control computer technology. Recognizing obsolescence as a fact of life, AECL has undertaken a program of supporting the digital control technology of existing CANDU plants. Other AECL groups are developing complete replacement systems for the digital control computers, and more advanced systems for the digital control computers of future CANDU reactors. This paper presents the results of the efforts of AECL's DCC service support group to replace obsolete digital control computer and related components and to provide friendlier software technology related to the maintenance and use of digital control computers in CANDU. These efforts are expected to extend the current lifespan of existing digital control computers through their mandated life. This group applied two simple rules: the product, whether new or a replacement, should have a generic basis, and the products should be applicable both to existing CANDU plants and to 'repeat' plant designs built using current design guidelines. While some exceptions do apply, the rules have been met. The generic requirement dictates that the product should not be dependent on any brand technology, and should back-fit to and interface with any such technology which remains in the control design. The application requirement dictates that the product should have universal use and be user friendly to the greatest extent possible. Furthermore, both requirements were designed to anticipate user involvement, modifications, and alternate user-defined applications. The replacements for hardware components such as the paper tape reader/punch, moving arm disk, contact scanner, and Ramtek are discussed. The development of these hardware replacements coincides with the development of a gateway system for selected CANDU digital control

  5. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there has been growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  6. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  7. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...... is designed with a high abstraction interface to improve developer productivity. The library is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). The library utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...
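
    To make the finite-difference idea concrete, the fragment below is a plain NumPy sketch, not the templated C++/GPU library described above: it applies a symmetric three-point stencil on a regular grid and checks the result against an analytic second derivative. The function name and boundary treatment are illustrative only.

      import numpy as np

      def second_derivative(u, dx, weights=(1.0, -2.0, 1.0)):
          """Apply a symmetric finite-difference stencil to interior points."""
          w_m, w_c, w_p = weights
          d2 = np.empty_like(u)
          d2[1:-1] = (w_m * u[:-2] + w_c * u[1:-1] + w_p * u[2:]) / dx**2
          d2[0] = d2[-1] = 0.0   # boundary handling omitted for brevity
          return d2

      x = np.linspace(0.0, np.pi, 201)
      u = np.sin(x)
      err = np.abs(second_derivative(u, x[1] - x[0]) + np.sin(x))[1:-1].max()
      print(f"max interior error of the 3-point stencil: {err:.2e}")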

  8. A directory of computer software applications: energy. Report for 1974--1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    The computer programs or the computer program documentation cited in this directory have been developed for a variety of applications in the field of energy. The cited computer software includes applications in solar energy, petroleum resources, batteries, electrohydrodynamic generators, magnetohydrodynamic generators, natural gas, nuclear fission, nuclear fusion, hydroelectric power production, and geothermal energy. The computer software cited has been used for simulation and modeling, calculations of future energy requirements, calculations of energy conservation measures, and computations of economic considerations of energy systems

  9. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  10. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as "the family of SHMS." When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  11. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization,existing CSE software may need to integrate other CSE software systems developed by different groups of experts. Thecoupling problem is one of the challenges f...

  12. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges...

  13. The December 2006 ATLAS Computing & Software Workshop

    CERN Multimedia

    Fred Luehring

    The 29th ATLAS Computing & Software Workshop was held on December 11-15 at CERN. With the rapidly approaching onset of data taking, the workshop participants had an air of urgency about them. There was considerable discussion on hot topics such as physics validation of the software, data analysis, actual software production on the GRID, and the schedule of work for 2007 including the Final Dress Rehearsal (FDR). However don't be fooled, the workshop was not all work - there were also two social events which were greatly enjoyed by the attendees. The workshop welcomed Wouter Verkerke as the new Physics Validation Coordinator (replacing Davide Costanzo). Most recent validation work has centered on the 12.0.X release series that will be used for the Computing System Commissioning (CSC) exercise. The validation is now a big job because it needs to be done over a variety of conditions (magnetic field on/off, aligned/misaligned geometry) for every candidate release. Luckily there have been a large number of pe...

  14. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  15. Single-Molecule Fluorescence Reveals the Oligomerization and Folding Steps Driving the Prion-like Behavior of ASC.

    Science.gov (United States)

    Gambin, Yann; Giles, Nichole; O'Carroll, Ailís; Polinkovsky, Mark; Hunter, Dominic; Sierecki, Emma

    2018-02-16

    Single-molecule fluorescence has the unique ability to quantify small oligomers and track conformational changes at a single-protein level. Here we tackled one of the most extreme protein behaviors, found recently in an inflammation pathway. Upon danger recognition in the cytosol, NLRP3 recruits its signaling adaptor, ASC. ASC starts polymerizing in a prion-like manner and the system goes into "overdrive" by producing a single micron-sized "speck." By precisely controlling protein expression levels in an in vitro translation system, we could trigger the polymerization of ASC and mimic formation of specks in the absence of inflammasome nucleators. We utilized single-molecule spectroscopy to fully characterize prion-like behaviors and self-propagation of ASC fibrils. We next used our controlled system to monitor the conformational changes of ASC upon fibrillation. Indeed, ASC consists of PYD and CARD domains, separated by a flexible linker. Individually, both domains have been found to form fibrils, but the structure of the polymers formed by the full-length ASC protein remains elusive. For the first time, using single-molecule Förster resonance energy transfer, we studied the relative positions of the CARD and PYD domains of full-length ASC. An unexpectedly large conformational change occurred upon ASC fibrillation, suggesting that the CARD domain folds back onto the PYD domain. However, contradicting current models, the "prion-like" conformer was not initiated by binding of ASC to the NLRP3 platform. Rather, using a new method, a hybrid between Photon Counting Histogram and Number and Brightness analysis, we showed that NLRP3 forms hexamers with self-binding affinities around 300 nM. Overall our data suggest a new mechanism, where NLRP3 can initiate ASC polymerization simply by increasing the local concentration of ASC above a supercritical level. Copyright © 2017. Published by Elsevier Ltd.

  16. Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.

    Science.gov (United States)

    Reed, Mary Hutchings

    This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and ensuring the…

  17. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  18. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  19. Functional requirements for gas characterization system computer software

    International Nuclear Information System (INIS)

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel

  20. Copyright Protection for Computer Software: Is There a Need for More Protection?

    Science.gov (United States)

    Ku, Linlin

    Because the computer industry's expansion has been much faster than has the development of laws protecting computer software and since the practice of software piracy seems to be alive and well, the issue of whether existing laws can provide effective protection for software needs further discussion. Three bodies of law have been used to protect…

  1. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  2. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  3. Software for simulation of a computed tomography imaging spectrometer using optical design software

    Science.gov (United States)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our Imaging Spectrometer Simulation Software, known under the name Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration, and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.

  4. A study of HPV typing for the management of HPV-positive ASC-US cervical cytologic results.

    Science.gov (United States)

    Schiffman, Mark; Vaughan, Laurence M; Raine-Bennett, Tina R; Castle, Philip E; Katki, Hormuzd A; Gage, Julia C; Fetterman, Barbara; Befano, Brian; Wentzensen, Nicolas

    2015-09-01

    In US cervical screening, immediate colposcopy is recommended for women with HPV-positive ASC-US (equivocal) cytology. We evaluated whether partial typing by Onclarity™ (BD) might identify HPV-positive women with low enough CIN3+ risk to permit 1-year follow-up instead. The NCI-Kaiser Permanente Northern California Persistence and Progression cohort includes a subset of 13,890 women aged 21+ with HC2 (Qiagen)-positive ASC-US at enrollment; current median follow-up is 3.0 years. Using stratified random sampling, we typed 2079 archived enrollment specimens, including 329 women subsequently diagnosed with CIN3+, 563 with CIN2, and 1187 with <CIN2. We computed 3-year cumulative CIN3+ risks for each Onclarity typing channel, using Kaplan-Meier methods. The 3-year CIN3+ risk for all HC2-positive women with ASC-US was 5.2%; this establishes the "benchmark" risk for colposcopic referral. Hierarchically, 3-year cumulative risks for each typing channel were 16.0% for HPV16, 7.4% for HPV18, 7.0% for HPV31, 7.1% for grouped HPV33/58, 4.3% for HPV52, 3.9% for HPV45, 2.7% for HPV51, 1.6% for HPV39/68/35, and 1.3% for HPV59/56/66. ASC-US linked to HPV16, HPV18, HPV31, or HPV33/58 warrants immediate colposcopy. Optimal management of women with HPV52 or HPV45 is uncertain. The risk for women with only HPV51, HPV39/68/35, or HPV59/56/66 might be low enough to recommend 1-year retesting, permitting viral clearance. This strategy would defer colposcopy for 40% of women with HPV-positive ASC-US, half of whom would be cotest-negative at 1-year return. Approximately 10% of those with CIN3 diagnosable at enrollment would be delayed 1 year instead. Cost-effectiveness analyses are needed. Published by Elsevier Inc.
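
    The triage logic summarized above reduces to comparing each genotype group's 3-year CIN3+ risk with the 5.2% benchmark. The Python sketch below encodes that comparison using the risk figures quoted in the abstract; it is only an illustration of the described strategy (the abstract itself flags HPV52 and HPV45 as uncertain), not clinical guidance.

      # Risk figures (%) are those quoted in the abstract; the rule is illustrative.
      BENCHMARK = 5.2  # 3-year CIN3+ risk for all HC2-positive ASC-US

      RISK_BY_CHANNEL = {
          "HPV16": 16.0, "HPV18": 7.4, "HPV31": 7.0, "HPV33/58": 7.1,
          "HPV52": 4.3, "HPV45": 3.9, "HPV51": 2.7,
          "HPV39/68/35": 1.6, "HPV59/56/66": 1.3,
      }

      def management(channel: str) -> str:
          """Refer to colposcopy when the group risk meets or exceeds the benchmark."""
          return ("immediate colposcopy"
                  if RISK_BY_CHANNEL[channel] >= BENCHMARK
                  else "retest in 1 year")

      for ch, risk in RISK_BY_CHANNEL.items():
          print(f"{ch:12s} {risk:5.1f}%  ->  {management(ch)}")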

  5. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  6. What makes computational open source software libraries successful?

    International Nuclear Information System (INIS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects. (paper)

  7. What makes computational open source software libraries successful?

    Science.gov (United States)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  8. From On-Premise Software to Cloud Services: The Impact of Cloud Computing on Enterprise Software Vendors' Business Models

    OpenAIRE

    Boillat, Thomas; Legner, Christine

    2013-01-01

    Cloud computing is an emerging paradigm that allows users to conveniently access computing resources as pay-per-use services. Whereas cloud offerings such as Amazon's Elastic Compute Cloud and Google Apps are rapidly gaining a large user base, enterprise software's migration towards the cloud is still in its infancy. For software vendors the move towards cloud solutions implies profound changes in their value-creation logic. Not only are they forced to deliver fully web-enabled solutions and t...

  9. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides computer software documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-21I

  10. A Comparison of ASCE and FAO56 Reference Evapotranspiration at Different Subdaily Timescales: a Numerical Study

    Directory of Open Access Journals (Sweden)

    Farzin Parchami-Araghi

    2017-01-01

    Introduction: Subdaily estimates of reference evapotranspiration (ETo) are needed in many applications such as dynamic agro-hydrological modeling. The ASCE and FAO56 Penman–Monteith models (ASCE-PM and FAO56-PM, respectively) have received favorable acceptance and application over much of the world, including the United States, for establishing a reference evapotranspiration (ETo) index as a function of weather parameters. In the past several years, various studies have evaluated the ASCE-PM and FAO56-PM models for calculating hourly or 15-min ETo, either by comparing them with lysimetric measurements or by comparison with one another (2, 3, 5, 9, 10, 11, 16, 17, 19). In this study, sub-daily ETo estimates made by the ASCE-PM and FAO56-PM models at different timescales (1-360 min) were compared through a computational experiment, using a daily-to-sub-daily disaggregation framework developed by Parchami-Araghi et al. (14). Materials and Methods: Daily and sub-daily weather data at different timescales (1-360 min) were generated via the daily-to-sub-daily weather data disaggregation framework of Parchami-Araghi et al. (14), using long-term (59 years) daily weather data obtained from the Abadan synoptic weather station. Daily/sub-daily net longwave radiation (Rnl) was estimated through six different approaches, combining two different criteria for identifying the daytime/nighttime periods: 1) the standard criteria implemented in both the ASCE-PM and FAO56-PM models, and 2) the actual times of sunset and sunrise; with 1) estimation of clear-sky radiation (Rso) based on the standard approach implemented in both the ASCE-PM and FAO56-PM models (1st and 2nd Rnl estimation approaches, respectively), 2) integration of the Rso estimates derived via a physically based solar radiation model developed by Yang et al. (25; YNG model) at one-second time steps (3rd and 4th Rnl estimation approaches, respectively), and 3) integration of
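
    For orientation, both models share the standardized Penman-Monteith form for hourly ETo and differ mainly in the wind-term constant Cd in the denominator. The Python sketch below shows that shared form with illustrative input values; the constants used (Cn = 37; Cd = 0.34 for FAO56-PM, 0.24 for the daytime ASCE-PM short reference) are quoted from memory of FAO-56 and ASCE-EWRI (2005) and should be checked against those sources before any real use.

      def hourly_eto(delta, gamma, rn, g, t_mean, u2, es, ea, cd, cn=37.0):
          """Standardized Penman-Monteith reference ET (mm per hour).
          delta, gamma [kPa/C]; rn, g [MJ m-2 h-1]; t_mean [C]; u2 [m/s]; es, ea [kPa]."""
          num = 0.408 * delta * (rn - g) + gamma * (cn / (t_mean + 273.0)) * u2 * (es - ea)
          den = delta + gamma * (1.0 + cd * u2)
          return num / den

      # Example daytime hour (illustrative values only).
      args = dict(delta=0.220, gamma=0.066, rn=1.8, g=0.18,
                  t_mean=28.0, u2=2.5, es=3.78, ea=1.89)
      print("FAO56-PM:", round(hourly_eto(cd=0.34, **args), 3), "mm/h")
      print("ASCE-PM :", round(hourly_eto(cd=0.24, **args), 3), "mm/h")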

  11. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  12. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    Air Command and Staff College student report. "...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  13. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology, it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing where computing environments and resources are not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  14. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  15. Fun and software exploring pleasure, paradox and pain in computing

    CERN Document Server

    Goriunova, Olga

    2014-01-01

    Fun and Software offers the untold story of fun as constitutive of the culture and aesthetics of computing. Fun in computing is a mode of thinking, making and experiencing. It invokes and convolutes the question of rationalism and logical reason, addresses the sensibilities and experience of computation and attests to its creative drives. By exploring topics as diverse as the pleasure and pain of the programmer, geek wit, affects of play and coding as a bodily pursuit of the unique in recursive structures, Fun and Software helps construct a different point of entry to the understanding of soft

  16. SOFTWARE FOR COMPUTER-AIDED DESIGN OF CROSS-WEDGE ROLLING

    OpenAIRE

    A. A. Abramov; S. V. Medvedev

    2013-01-01

    The creation of computer technology for 3D design and engineering analysis of metal forming processes based on cross-wedge rolling (CWR) is considered. The developed software for computer-aided design and simulation of cross-wedge rolling is described.

  17. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  18. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  19. Required number of records for ASCE/SEI 7 ground-motion scaling procedure

    Science.gov (United States)

    Reyes, Juan C.; Kalkan, Erol

    2011-01-01

    The procedures and criteria in the 2006 IBC (International Council of Building Officials, 2006) and 2007 CBC (International Council of Building Officials, 2007) for the selection and scaling of ground motions for use in nonlinear response history analysis (RHA) of structures are based on ASCE/SEI 7 provisions (ASCE, 2005, 2010). According to ASCE/SEI 7, earthquake records should be selected from events whose magnitudes, fault distances, and source mechanisms are consistent with the maximum considered earthquake, and then scaled so that the average value of the 5-percent-damped response spectra for the set of scaled records is not less than the design response spectrum over the period range from 0.2Tn to 1.5Tn (where Tn is the fundamental vibration period of the structure). If at least seven ground motions are analyzed, the design values of engineering demand parameters (EDPs) are taken as the average of the EDPs determined from the analyses. If fewer than seven ground motions are analyzed, the design values of EDPs are taken as the maximum values of the EDPs. ASCE/SEI 7 requires a minimum of three ground motions. These limits on the number of records in the ASCE/SEI 7 procedure are based on engineering experience rather than on a comprehensive evaluation. This study statistically examines the required number of records for the ASCE/SEI 7 procedure, such that the scaled records provide accurate, efficient, and consistent estimates of "true" structural responses. Based on elastic-perfectly-plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI 7 scaling procedure is applied to 480 sets of ground motions. The number of records in these sets varies from three to ten. The records in each set were selected either (i) randomly, (ii) considering their spectral shapes, or (iii) considering their spectral shapes and design spectral-acceleration value, A(Tn). As compared to benchmark (that is, "true") responses from unscaled records using a larger catalog of ground
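    As a rough illustration of the scaling check described above, the sketch below computes a single factor that lifts the average 5-percent-damped spectrum of a record set to at least the design spectrum over 0.2Tn to 1.5Tn. It is a simplified single-component version with synthetic spectra and hypothetical array inputs (NumPy assumed), not the full ASCE/SEI 7 procedure.

```python
import numpy as np

def ascesei7_scale_factor(periods, record_spectra, design_spectrum, tn):
    """Smallest set-level factor so that the average spectrum of the scaled
    records is not less than the design spectrum over 0.2*Tn to 1.5*Tn.
    Illustrative sketch only; a real application scales individual records."""
    periods = np.asarray(periods)
    record_spectra = np.asarray(record_spectra)    # shape: (n_records, n_periods)
    design_spectrum = np.asarray(design_spectrum)
    mask = (periods >= 0.2 * tn) & (periods <= 1.5 * tn)
    mean_spectrum = record_spectra.mean(axis=0)
    return np.max(design_spectrum[mask] / mean_spectrum[mask])

# Hypothetical usage with synthetic spectra for a structure with Tn = 1.0 s.
T = np.linspace(0.05, 3.0, 60)
design = 0.6 / np.maximum(T, 0.5)                  # crude design-spectrum shape
records = np.abs(np.random.default_rng(1).normal(0.4, 0.1, (7, T.size)))
print(ascesei7_scale_factor(T, records, design, tn=1.0))
```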

  20. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Science.gov (United States)

    2010-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  1. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itse...

  2. Structure and assembly of the mouse ASC inflammasome by combined NMR spectroscopy and cryo-electron microscopy

    Science.gov (United States)

    Sborgi, Lorenzo; Ravotti, Francesco; Dandey, Venkata P.; Dick, Mathias S.; Mazur, Adam; Reckel, Sina; Chami, Mohamed; Scherer, Sebastian; Huber, Matthias; Böckmann, Anja; Egelman, Edward H.; Stahlberg, Henning; Broz, Petr; Meier, Beat H.; Hiller, Sebastian

    2015-01-01

    Inflammasomes are multiprotein complexes that control the innate immune response by activating caspase-1, thus promoting the secretion of cytokines in response to invading pathogens and endogenous triggers. Assembly of inflammasomes is induced by activation of a receptor protein. Many inflammasome receptors require the adapter protein ASC [apoptosis-associated speck-like protein containing a caspase-recruitment domain (CARD)], which consists of two domains, the N-terminal pyrin domain (PYD) and the C-terminal CARD. Upon activation, ASC forms large oligomeric filaments, which facilitate procaspase-1 recruitment. Here, we characterize the structure and filament formation of mouse ASC in vitro at atomic resolution. Information from cryo-electron microscopy and solid-state NMR spectroscopy is combined in a single structure calculation to obtain the atomic-resolution structure of the ASC filament. Perturbations of NMR resonances upon filament formation monitor the specific binding interfaces of ASC-PYD association. Importantly, NMR experiments show the rigidity of the PYD forming the core of the filament as well as the high mobility of the CARD relative to this core. The findings are validated by structure-based mutagenesis experiments in cultured macrophages. The 3D structure of the mouse ASC-PYD filament is highly similar to the recently determined human ASC-PYD filament, suggesting evolutionary conservation of ASC-dependent inflammasome mechanisms. PMID:26464513

  3. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer-aided software engi...

  4. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Full Text Available Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are treated not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption to derive four new best practices for software development and incorporates the identified best practices into currently used processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS cloud-oriented development process by analyzing feedback data collected from its actual application to the development of a SaaS cloud service, Astation.

  5. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clear differentiation for assessing and evaluating risk according to the principles of software risk management. Problem statement: Prevailing software quality models and standards do not adequately address software safety ...

  6. Software For Computer-Aided Design Of Control Systems

    Science.gov (United States)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  7. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  8. Comparison of Simulated Microgravity and Hydrostatic Pressure for Chondrogenesis of hASC.

    Science.gov (United States)

    Mellor, Liliana F; Steward, Andrew J; Nordberg, Rachel C; Taylor, Michael A; Loboa, Elizabeth G

    2017-04-01

    Cartilage tissue engineering is a growing field due to the lack of regenerative capacity of native tissue. The use of bioreactors for cartilage tissue engineering is common, but the results are controversial. Some studies suggest that microgravity bioreactors are ideal for chondrogenesis, while others show that mimicking hydrostatic pressure is crucial for cartilage formation. A parallel study comparing the effects of loading and unloading on chondrogenesis has not been performed. The goal of this study was to evaluate chondrogenesis of human adipose-derived stem cells (hASC) under two different mechanical stimuli relative to static culture: microgravity and cyclic hydrostatic pressure (CHP). Pellets of hASC were cultured for 14 d under simulated microgravity using a rotating wall vessel bioreactor or under CHP (7.5 MPa, 1 Hz, 4 h · d⁻¹) using a hydrostatic pressure vessel. We found that CHP increased mRNA expression of Aggrecan, Sox9, and Collagen II, caused a threefold increase in sulfated glycosaminoglycan production, and resulted in stronger vimentin staining intensity and organization relative to microgravity. In addition, Wnt-signaling patterns were altered in a manner that suggests that simulated microgravity decreases chondrogenic differentiation when compared to CHP. Our goal was to compare chondrogenic differentiation of hASC using a microgravity bioreactor and a hydrostatic pressure vessel, two commonly used bioreactors in cartilage tissue engineering. Our results indicate that CHP promotes hASC chondrogenesis and that microgravity may inhibit hASC chondrogenesis. Our findings further suggest that cartilage formation and regeneration might be compromised in space due to the lack of mechanical loading. Mellor LF, Steward AJ, Nordberg RC, Taylor MA, Loboa EG. Comparison of simulated microgravity and hydrostatic pressure for chondrogenesis of hASC. Aerosp Med Hum Perform. 2017; 88(4):377-384.

  9. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  10. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  11. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the transport-layer protocols currently used, TCP, was developed for traffic demands different from those on the ASC WAN. The Stream Control Transmission Protocol (SCTP), on the other hand, has shown characteristics that make it more appealing for networks such as these. Most importantly, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. To find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? (4) Is the tool actively developed? Following the analysis of those tools, this paper concludes with recommendations and ideas for future work.

  12. The ''NAIRI-2'' computer plotter software

    International Nuclear Information System (INIS)

    Aksenova, E.K.; Kol'ga, V.V.; Trejbal, Z.

    1977-01-01

    The software for the plotter of the ''Nairi-2'' computer is described. The system of subprograms ''Plot'', written in the machine language of ''Nairi-2'', makes it possible to graphically present information obtained with the ''Nairi-2'' computer and with base computers (BESM-6, CDC-6500) through the information processing system. A graph can be drawn to a pre-selected scale either as a continuous line, with programmed linear interpolation between points and plotting of the x and y coordinate axes, or as separate points with construction of the x and y coordinate axes, in any prescribed direction. The subprogram system is operated in the autoprogramming language, using a number of new operators introduced into the ''Nairi-2'' translator.

  13. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability, but existing reliability models are restricted to particular types of methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and there is a need to focus on parameter selection while estimating reliability. The reliability of a system may increase or decrease depending on the parameters used, so the factors that most heavily affect system reliability must be identified. Nowadays, reusability is widely used across many areas of research; it is the basis of Component-Based Systems (CBS). Cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to obtain accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to medicine-related problems: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently and preferably uses combined neural-network-genetic-algorithm approaches, and medical scientists show strong interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology and neurology. CBSE encourages users to reuse past and existing software when building new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing
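    To make one of the listed techniques concrete, the sketch below uses a bare-bones particle swarm optimization to fit the two parameters of a Goel-Okumoto reliability growth model to hypothetical cumulative-failure data. The data, bounds, and PSO settings are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Cumulative failures observed at successive test times (hypothetical data).
t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
failures = np.array([12, 21, 27, 32, 35, 37, 39, 40], dtype=float)

def goel_okumoto(params, t):
    a, b = params                       # a: expected total faults, b: detection rate
    return a * (1.0 - np.exp(-b * t))

def sse(params):
    return np.sum((goel_okumoto(params, t) - failures) ** 2)

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer minimizing `cost` within box `bounds`."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pcost)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        gbest = pbest[np.argmin(pcost)]
    return gbest, cost(gbest)

params, err = pso(sse, bounds=[(1.0, 200.0), (0.01, 2.0)])
print("a, b =", params, "SSE =", err)
```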

  14. The Implementation of Computer Data Processing Software for EAST NBI

    International Nuclear Information System (INIS)

    Zhang Xiaodan; Hu Chundong; Sheng Peng; Zhao Yuanzhe; Wu Deyun; Cui Qinglong

    2014-01-01

    One of the most important project missions of the neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power delivered to the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for handling experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is written in C and runs on the Linux operating system, using the TCP network protocol and multi-threading. The hardware mainly includes an industrial control computer (IPC), a data server, and PXI DAQ cards. This software has now been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well. (fusion engineering)
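    The actual CDPS is written in C; the following is a minimal Python sketch of the same general pattern, a multithreaded TCP service that receives length-prefixed data frames, compresses them, and appends them to a store. The class names, port number, framing scheme, and file name are illustrative assumptions, not details of the EAST software.

```python
import socketserver
import threading
import zlib

write_lock = threading.Lock()        # serialize file appends across client threads

class FrameHandler(socketserver.StreamRequestHandler):
    """Receive length-prefixed data frames, compress them, and append them to a file."""
    def handle(self):
        while True:
            header = self.rfile.read(4)
            if len(header) < 4:                      # client closed the connection
                break
            size = int.from_bytes(header, "big")
            payload = self.rfile.read(size)
            compressed = zlib.compress(payload)      # simple lossless compression
            with write_lock, open("shot_data.bin", "ab") as f:
                f.write(len(compressed).to_bytes(4, "big") + compressed)

if __name__ == "__main__":
    # One worker thread per acquisition client; the port number is hypothetical.
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9009), FrameHandler) as server:
        server.serve_forever()
```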

  15. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
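    As a rough illustration of the Bayesian (maximum a posteriori) dose-individualization step that such programs automate, the sketch below estimates clearance and volume for a toy one-compartment IV-bolus model from two measured concentrations and then proposes a dose for a hypothetical trough target. All model choices, priors, and numbers are assumptions for illustration and do not reflect any of the reviewed programs.

```python
import numpy as np

# Hypothetical one-compartment IV-bolus model: C(t) = Dose / V * exp(-CL/V * t)
dose = 500.0                       # mg
t_obs = np.array([2.0, 8.0])       # h
c_obs = np.array([18.0, 9.5])      # mg/L, measured concentrations

# Population priors (log-normal), purely illustrative values.
cl_pop, v_pop = 3.0, 30.0          # L/h, L
omega_cl, omega_v = 0.3, 0.2       # inter-individual SDs (log scale)
sigma = 0.15                       # residual error SD (log scale)

def neg_log_posterior(cl, v):
    c_pred = dose / v * np.exp(-cl / v * t_obs)
    residual = np.sum((np.log(c_obs) - np.log(c_pred)) ** 2) / (2 * sigma**2)
    prior = (np.log(cl / cl_pop) ** 2) / (2 * omega_cl**2) \
          + (np.log(v / v_pop) ** 2) / (2 * omega_v**2)
    return residual + prior

# Crude MAP search over a parameter grid (a real tool would use an optimizer).
cl_grid = np.linspace(0.5, 10.0, 200)
v_grid = np.linspace(10.0, 60.0, 200)
obj = np.array([[neg_log_posterior(cl, v) for v in v_grid] for cl in cl_grid])
i, j = np.unravel_index(np.argmin(obj), obj.shape)
cl_map, v_map = cl_grid[i], v_grid[j]

# Dose needed to keep the 24 h trough above a hypothetical 5 mg/L target.
target, tau = 5.0, 24.0
dose_adj = target * v_map * np.exp(cl_map / v_map * tau)
print(f"MAP CL={cl_map:.2f} L/h, V={v_map:.1f} L, suggested dose={dose_adj:.0f} mg")
```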

  16. MicroASC instrument onboard Juno spacecraft utilizing inertially controlled imaging

    DEFF Research Database (Denmark)

    Pedersen, David Arge Klevang; Jørgensen, Andreas Härstedt; Benn, Mathias

    2016-01-01

    This contribution describes the post-processing of the raw image data acquired by the microASC instrument during the Earth fly-by of the Juno spacecraft. The images show a unique view of the Earth and Moon system as seen from afar. The procedure utilizes attitude measurements and inter-calibration of the Camera Head Units of the microASC system to trigger the image capturing. The triggering is synchronized with the inertial attitude and rotational phase of the sensor acquiring the images. This essentially works as inertially controlled imaging, facilitating image acquisition from unexplored...

  17. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  18. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  19. Application of software technology to a future spacecraft computer design

    Science.gov (United States)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  20. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resource, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  1. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.

  2. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  3. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirements capture phase through the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored

  4. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirements capture phase through the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
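    Two of the test-design techniques named above, equivalence class partitioning and boundary value analysis, are easy to illustrate. The sketch below generates candidate test inputs for a hypothetical numeric requirement (a setpoint valid between 10 and 100 units); the function names and range are illustrative and not taken from the PFBR project.

```python
def boundary_values(lo, hi, step=1):
    """Classic boundary-value analysis points for a valid integer range [lo, hi]."""
    return [lo - step, lo, lo + step, (lo + hi) // 2, hi - step, hi, hi + step]

def equivalence_classes(lo, hi):
    """One representative per equivalence class: below, inside, and above the range."""
    return {"invalid_low": lo - 10, "valid": (lo + hi) // 2, "invalid_high": hi + 10}

# Hypothetical requirement: a setpoint must lie between 10 and 100 units.
print(boundary_values(10, 100))      # [9, 10, 11, 55, 99, 100, 101]
print(equivalence_classes(10, 100))  # {'invalid_low': 0, 'valid': 55, 'invalid_high': 110}
```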

  5. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
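    The core idea of bitplane-based incremental computation can be illustrated with a toy example: process an image one bitplane at a time, from most to least significant, so the partial convolution result after k planes is exactly the convolution of the k-bit-quantized input and improves monotonically. The sketch below assumes NumPy and SciPy are available and does not reproduce the paper's incremental packing framework.

```python
import numpy as np
from scipy.signal import convolve2d   # SciPy assumed available

def bitplanes(img, bits=8):
    """Split an unsigned-integer image into bitplanes, most significant first."""
    return [((img >> b) & 1).astype(np.int64) for b in range(bits - 1, -1, -1)]

def incremental_convolution(img, kernel, bits=8):
    """Refine a 2-D convolution one bitplane at a time (MSB first).

    Each increment adds the contribution of the next bitplane, so the partial
    result after k planes equals the convolution of the k-bit-quantized input."""
    partial = np.zeros(np.array(img.shape) + np.array(kernel.shape) - 1)
    results = []
    for b, plane in zip(range(bits - 1, -1, -1), bitplanes(img, bits)):
        partial = partial + (1 << b) * convolve2d(plane, kernel)
        results.append(partial.copy())       # monotonically improving approximations
    return results

rng = np.random.default_rng(0)
image = rng.integers(0, 256, (32, 32), dtype=np.uint8).astype(np.int64)
kern = np.ones((3, 3)) / 9.0
approximations = incremental_convolution(image, kern)
exact = approximations[-1]
for k, approx in enumerate(approximations, start=1):
    print(k, "planes, max error:", np.abs(approx - exact).max())
```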

  6. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  7. [Effects of chronic fluoxetine treatment on manifestation of sexual motivation and social behavior in mice of ASC line].

    Science.gov (United States)

    Tikhonova, M A; Otroshchenko, E A; Kulikov, A V

    2010-02-01

    Sexual dysfunctions are typical symptoms accompanying depressive disorders. However, antidepressants that improve the general state of patients have no effect on these sexual disorders. Mice of the ASC (Antidepressant Sensitive Catalepsy) line, with a high hereditary predisposition to catalepsy, have been proposed as a model of a genetically associated depressive-like condition. The work was aimed at comparing behavioral indices of sexual motivation and social interest of ASC mice with those of mice of the parental inbred AKR and CBA strains, and at studying the effects of chronic fluoxetine treatment in doses of 10 and 20 mg/kg on these parameters in ASC mice. ASC males demonstrated reduced sexual motivation, which was not corrected by fluoxetine. ASC mice did not differ from mice of the parental strains in the expression of social interest and aggression towards a juvenile intruder. Fluoxetine failed to alter the social behavior of ASC mice in the social interaction test, but its higher dose decreased the percentage of aggressors. The ASC mouse line appears to be a promising model for studying the genetic mechanisms of sexual dysfunctions associated with depressive conditions.

  8. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  9. Analytical exploration of the thermodynamic potentials by using symbolic computation software

    International Nuclear Information System (INIS)

    Hantsaridou, Anastasia P; Polatoglou, Hariton M

    2005-01-01

    Thermodynamics is a very general theory, based on fundamental symmetries. It generalizes classical mechanics and incorporates theoretical concepts such as field and field equations. Although all these ingredients are of the highest importance for a scientist, they are not given the attention they perhaps deserve in most undergraduate courses. Nowadays, powerful computers in conjunction with equally powerful software can ease the exploration of the crucial ideas of thermodynamics. The purpose of the present work is to show how the utilization of symbolic computation software can lead to a complementary understanding of thermodynamics. The method was applied to first and second year physics students in the Aristotle University of Thessaloniki (Greece) during the 2002-2003 academic year. The results indicate that symbolic computation software is appropriate not only for enhancing the teaching of the fundamental principles in thermodynamics and their applications, but also for increasing students' motivation for learning
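    In the same spirit, a symbolic computation system can be used to recover standard thermodynamic results directly from a potential. The short sketch below (using the SymPy library, assumed available) derives the pressure and entropy from a simplified ideal-gas Helmholtz free energy (constant and reference-entropy terms omitted) and verifies a Maxwell relation; the particular form of F is a textbook illustration, not an expression taken from the paper.

```python
import sympy as sp

T, V, n, R = sp.symbols('T V n R', positive=True)
c = sp.Symbol('c', positive=True)          # heat-capacity-related constant

# Simplified ideal-gas Helmholtz free energy F(T, V).
F = -n*R*T*sp.log(V) - c*n*R*T*sp.log(T)

# Thermodynamic definitions: P = -(dF/dV)_T,  S = -(dF/dT)_V
P = -sp.diff(F, V)
S = -sp.diff(F, T)
print(sp.simplify(P))                      # n*R*T/V : the ideal-gas law is recovered

# Maxwell relation derived from F: (dS/dV)_T == (dP/dT)_V
print(sp.simplify(sp.diff(S, V) - sp.diff(P, T)))   # 0
```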

  10. Acute stress among adolescents and female rape victims measured by ASC-Kids: a pilot study.

    Science.gov (United States)

    Nilsson, Doris; Nordenstam, Carin; Green, Sara; Wetterhall, Annika; Lundin, Tom; Svedin, Carl Göran

    2015-01-01

    Rape is considered a stressful trauma, often with durable consequences. How young adolescent girls experience the aftermath of rape in terms of acute stress is an overlooked field and remains to be studied. In this study, we wanted to investigate acute stress among adolescent victims of rape and the psychometric properties of the Acute Stress Checklist for Children (ASC-Kids). A clinical sample (n = 79) of raped girls, 13-17 years old, who had turned to a specialized rape victim unit for treatment, answered the ASC-Kids. The ASC-Kids was also given to a group of mildly stressed, non-raped adolescents in the same age range (n = 154), together with the University of California at Los Angeles Post-traumatic Stress Disorder Reaction Index (UCLA PTSD RI) and the Sense of Coherence Scale 13 (SOC-13). The scores from the groups were compared and showed significant differences in mean values on all the diagnostic criteria of acute stress disorder. In the clinical group, 36.7% met full ASD criteria. The ASC-Kids could discriminate well between groups. Cronbach's alpha was found to be excellent, and the correlation between the UCLA PTSD RI and the ASC-Kids was found to be good; both the ASC-Kids and the UCLA PTSD RI had moderate-to-good negative correlations with the SOC-13. Adolescent female rape victims were shown to have a very high level of acute stress, and the ASC-Kids was found to have sound psychometrics and can be a valuable screening instrument to support clinicians in their assessment of adolescents after potentially stressful events such as rape.
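    The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute; the sketch below applies the standard formula to a small hypothetical item-score matrix (the data are invented and not related to the study).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical checklist responses (rows = respondents, columns = items).
scores = np.array([[2, 3, 3, 1],
                   [0, 1, 1, 0],
                   [3, 3, 2, 2],
                   [1, 2, 2, 1],
                   [2, 2, 3, 2]])
print(round(cronbach_alpha(scores), 2))
```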

  11. A NEW CONTROL CIRCUIT AND COMPUTER SOFTWARE FOR CONTROLING PHOTOVOLTAIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Mustafa Berkant SELEK

    2008-02-01

    Full Text Available In this study, a new microcontroller circuit was designed and new computer software was implemented to control the power flow of the renewable energy system established at the Solar Energy Institute, Ege University, Bornova, Izmir, Turkey. A PIC18F452 microcontroller-based electronic circuit was designed to control another electronic circuit that includes power electronic switching components. Readily available standard control circuits are designed for switching single-level inverters; in contrast, the implemented circuit can switch multilevel inverters. In addition, because the efficiency of solar panels is considerably low, the panels should be operated at the maximum power point (MPP), so an MPP tracking algorithm is included in the designed control circuit. The control circuit also includes a serial communication interface based on the RS232 standard; using this interface, the user can select all functions available in the control circuit and obtain status reports via the computer software. Finally, a general-purpose command set was designed to establish communication between the computer software and the microcontroller-based control circuit. This study is thus intended to provide a basis for researchers who want to develop their own control circuits or more visual software.
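    The firmware itself is not described in the record; the sketch below is a language-neutral illustration, written in Python, of a perturb-and-observe MPP-tracking loop of the kind such controllers commonly use. The function names, step size, and toy panel model are assumptions, not details of the implemented circuit.

```python
def perturb_and_observe(read_voltage, read_current, set_voltage,
                        v_start=17.0, step=0.2, iterations=100):
    """Minimal perturb-and-observe MPPT loop (illustrative sketch only).

    read_voltage/read_current/set_voltage stand in for the ADC reads and the
    converter setpoint that real microcontroller firmware would provide."""
    v_ref = v_start
    set_voltage(v_ref)
    prev_power = read_voltage() * read_current()
    for _ in range(iterations):
        v_ref += step
        set_voltage(v_ref)
        power = read_voltage() * read_current()
        if power < prev_power:          # wrong direction: reverse the perturbation
            step = -step
        prev_power = power
    return v_ref

# Dry run against a toy P-V curve standing in for the panel.
state = {"v": 17.0}
panel_current = lambda v: max(0.0, 3.0 - 0.02 * (v - 17.5) ** 2)
print(perturb_and_observe(lambda: state["v"],
                          lambda: panel_current(state["v"]),
                          lambda v: state.update(v=v)))
```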

  12. Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia.

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk (Kitware, Inc., Clifton Park, NY); Fabian, Nathan; Marion, Patrick (Kitware, Inc., Clifton Park, NY); Moreland, Kenneth D.

    2010-09-01

    This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratory. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for petascale visualization. Sandia's contribution concerns in-situ visualization, running a visualization in tandem with a solver. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk, we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next-generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four
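    At its simplest, the in-situ pattern described above couples the solver loop to an analysis routine that runs every timestep and keeps only small summaries, avoiding raw-field I/O. The sketch below is a minimal stand-in: the solver, field update, and statistics are all hypothetical and bear no relation to the milestone's actual codes or libraries.

```python
import numpy as np

def in_situ_analysis(step, field):
    """Analysis coupled to the solver: runs every timestep and stores only
    small summary statistics instead of writing the full field to disk."""
    return {"step": step, "min": float(field.min()),
            "max": float(field.max()), "mean": float(field.mean())}

def run_simulation(n_steps=100, shape=(64, 64), analyze_every=1):
    field = np.zeros(shape)
    summaries = []
    for step in range(n_steps):
        # Stand-in for one solver timestep (diffusion-like smoothing plus noise).
        field = 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                        np.roll(field, 1, 1) + np.roll(field, -1, 1))
        field += np.random.default_rng(step).normal(0, 0.01, shape)
        if step % analyze_every == 0:
            summaries.append(in_situ_analysis(step, field))   # no raw-field I/O
    return summaries

print(run_simulation()[-1])
```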

  13. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  14. How many records should be used in ASCE/SEI-7 ground motion scaling procedure?

    Science.gov (United States)

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

    U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in the ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic–perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
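    The statistical question posed here can be illustrated with a toy Monte Carlo experiment: draw record sets of different sizes from a synthetic response population and apply the ASCE/SEI-7 rule (mean of EDPs for seven or more records, maximum otherwise). The lognormal population, the choice of the median as the "true" response, and the sample sizes below are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical lognormal population of peak responses (EDPs) from a large
# ground-motion catalog; the "true" response is taken as its median.
population = rng.lognormal(mean=0.0, sigma=0.6, size=10_000)
true_response = np.median(population)

for n_records in (3, 7, 10):
    estimates = []
    for _ in range(480):                       # 480 record sets, as in the study
        sample = rng.choice(population, n_records, replace=False)
        # ASCE/SEI-7 rule: mean of EDPs for >= 7 records, maximum otherwise.
        estimates.append(sample.mean() if n_records >= 7 else sample.max())
    ratio = np.array(estimates) / true_response
    print(f"{n_records:2d} records: median ratio to 'true' = {np.median(ratio):.2f}, "
          f"dispersion = {ratio.std():.2f}")
```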

  15. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision....... These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development....

  16. MRI-tracking of transplanted human ASC in a SCID mouse model

    Energy Technology Data Exchange (ETDEWEB)

    Siegmund, Birte J.; Kasten, Annika [Department of Oral and Maxillofacial Surgery, Facial Plastic Surgery, Rostock University Medical Center (Germany); Kühn, Jens-Peter [Institute of Diagnostic Radiology and Neuroradiology, Greifswald University Medical Center (Germany); Winter, Karsten [Institute of Anatomy, Faculty of Medicine, University of Leipzig (Germany); Grüttner, Cordula [Micromod Partikeltechnologie GmbH, Rostock (Germany); Frerich, Bernhard, E-mail: bernhard.frerich@med.uni-rostock.de [Department of Oral and Maxillofacial Surgery, Facial Plastic Surgery, Rostock University Medical Center (Germany)

    2017-04-01

    Background: Regarding strategies for improving the efficacy of stem cell transplantation in adipose tissue engineering, cell tracking might be useful. Here we report the in vivo tracking of adipose tissue-derived stem cells (ASC) by means of nanoparticle labeling and magnetic resonance imaging (MRI). Materials and methods: Human ASC were amplified and labeled with two types of magnetic nanoparticles (MNP), BNF starch and nanomag®-D-spio. Adipose tissue constructs were fabricated by seeding collagen scaffolds with labeled and unlabeled ASCs. Constructs were implanted subcutaneously in the back of severe combined immunodeficient (SCID) mice (n = 69; group 1: control with unlabeled cells, group 2: BNF-starch-labeled cells, group 3: nanomag®-D-spio-labeled cells). MRI scans were performed at 24 hours, four, twelve and 28 days, and four months in a 7.1 T animal device. Explanted constructs were analyzed histomorphometrically. Results: MRI scans showed high contrast of the labeled cells in the T2-TSE sequence compared to unlabeled controls. Loss of implant volume was observed over time due to partial loss of transplanted cells, without significant difference between groups (significance level p < 0.017). Compared to histomorphometry, a positive correlation was found in the measurement of implant size, significant at day four (correlation coefficient = 0.643; p = 0.024) and day twelve (correlation coefficient = 0.687; p = 0.010). Additional Prussian blue staining showed iron in all implants. Significant differences between the three groups (significance level p < 0.017) were found after twelve days between the control group and group 3 (p = 0.008) and after 28 days between the control group and groups 2 and 3 (p = 0.011). Conclusion: Both MNPs might be suitable for tracking ASC in vivo and show long-term stability over four months.

  17. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    Since the dawn of modern electronic computing in the mid 1940's, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U. S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction as well as data-intensive applications such as cryptography and cybersecurity U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S. and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high-end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the

  18. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software-defined optical network architecture comprising a resource layer, a service abstraction layer, a control layer and an application layer. We then elaborate on the corresponding service-providing method, in which a distinct service ID identifies the service a device can offer. Finally, we experimentally show that the proposed service-providing method can transmit different services based on the service ID in the service-oriented software-defined optical network.

  19. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
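
    The key idea in the record above is that likelihood-based models can be fitted without pooling raw records: each site returns only aggregate quantities computed on its local data. A minimal sketch of that idea, under my own assumptions (a logistic-regression likelihood and a central Newton-Raphson loop, not the paper's site-stratified Cox model or its R tooling), looks like this:

        # Minimal sketch: distributed maximum-likelihood fitting in which each site
        # shares only aggregates (gradient and Hessian of its local log-likelihood),
        # never raw patient records.
        import numpy as np

        def local_contributions(X, y, beta):
            """Per-site gradient and Hessian of the logistic log-likelihood."""
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            grad = X.T @ (y - p)                      # score vector
            hess = -(X.T * (p * (1 - p))) @ X         # Hessian (negative definite)
            return grad, hess

        def distributed_newton(sites, n_features, n_iter=25):
            """Pool only per-site aggregates and run Newton-Raphson centrally."""
            beta = np.zeros(n_features)
            for _ in range(n_iter):
                grad = np.zeros(n_features)
                hess = np.zeros((n_features, n_features))
                for X, y in sites:                    # in practice: remote calls
                    g, h = local_contributions(X, y, beta)
                    grad += g
                    hess += h
                beta -= np.linalg.solve(hess, grad)
            return beta

        # Hypothetical usage with three simulated "sites"
        rng = np.random.default_rng(0)
        true_beta = np.array([0.5, -1.0, 2.0])
        sites = []
        for _ in range(3):
            X = rng.normal(size=(200, 3))
            y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
            sites.append((X, y))
        print(distributed_newton(sites, 3))

    Only grad and hess cross site boundaries, which is what keeps the raw data local; the same pattern extends to other likelihoods and to the singular value decomposition case mentioned in the abstract.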

  20. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  1. Advances in Multimedia, Software Engineering and Computing Vol.1 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference concentrating its focus upon Multimedia, Software Engineering, Computing and Education. In the proceedings, you can learn much more about the work of researchers from all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange forum for researchers working in these fields. In order to meet the high standard of the Springer AISC series, the organizing committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, review meetings with the reviewers were held periodically, about five times, to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  2. Advances in Multimedia, Software Engineering and Computing Vol.2 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference concentrating its focus upon Multimedia, Software Engineering, Computing and Education. In the proceedings, you can learn much more about the work of researchers from all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange forum for researchers working in these fields. In order to meet the high standard of the Springer AISC series, the organizing committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, review meetings with the reviewers were held periodically, about five times, to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  3. Software Defined Radio Datalink Implementation Using PC-Type Computers

    National Research Council Canada - National Science Library

    Zafeiropoulos, Georgios

    2003-01-01

    The objective of this thesis was to examine the feasibility of implementation and the performance of a Software Defined Radio datalink, using a common PC-type host computer and a high-level programming language...

  4. The benefit of introducing audit software into curricula for computer ...

    African Journals Online (AJOL)

    The benefit of introducing audit software into curricula for computer auditing students: a student perspective from the University of Pretoria. ... willing to sacrifice more of their time for practical computer classes because they are aware of the beneficial impact on their understanding of the subject as well as their future careers.

  5. The experimental modification of a computer software package for ...

    African Journals Online (AJOL)

    The experimental modification of a computer software package for graphing algebraic functions. No abstract available. South African Journal of Education Vol. 25(2) 2005: 61-68.

  6. V-1 nuclear power plant standby RPP-16S computer software

    International Nuclear Information System (INIS)

    Suchy, R.

    1988-01-01

    The software structure and the functions of the program modules of the RPP-16S standby computer, which is part of the information system of the V-1 Bohunice nuclear power plant, are described. The multitasking AMOS operating system is used for the organization of programs in the computer. The program modules are classified into five functional groups: modules for the periodic collection of values and the measurement of process quantities for both nuclear power plant units; for the primary processing of the values; for monitoring the exceedance of preset limits; for unit operators' communication with the computer; and, as the fifth group, user program modules. The standby computer software was tested under the actual operating conditions of the V-1 power plant. The results showed that it operated correctly; minor shortcomings were removed. (Z.M.). 1 fig

  7. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  8. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  9. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  10. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussions were held on the results of surveys conducted for the development of strategic software for advanced computing and on candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distribution parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization methods and a maintenance system, (3) handling of the results, and (4) evaluation of the research and development program. In relation to category (1), it is stated that software matures with the passage of time, that the software is to be a commercial program, and that in the development of a commercial software program the process of basic study up to the preparation of a prototype should be completely separated from the process of its completion. (NEDO)

  11. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Computer-aided software and simulators are introduced during the sophomore year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  12. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Here they describe the project of a new software package which allows the reporting and filing of roentgenograms. The program was developed by a radiologist using a well-known database management system, dBASE III. The program was shaped to fit the radiologist's needs: it assists in reporting and allows radiological data to be filed with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer.

  13. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01


  14. Function of microRNAs in the Osteogenic Differentiation and Therapeutic Application of Adipose-Derived Stem Cells (ASCs

    Directory of Open Access Journals (Sweden)

    Walter M. Hodges

    2017-12-01

    Full Text Available Traumatic wounds with segmental bone defects represent substantial reconstructive challenges. Autologous bone grafting is considered the gold standard for surgical treatment in many cases, but donor site morbidity and associated post-operative complications remain a concern. Advances in regenerative techniques utilizing mesenchymal stem cell populations from bone and adipose tissue have opened the door to improving bone repair in the limbs, spine, and craniofacial skeleton. The widespread availability, ease of extraction, and lack of immunogenicity have made adipose-derived stem cells (ASCs) particularly attractive as a stem cell source for regenerative strategies. Recently it has been shown that small, non-coding miRNAs are involved in the osteogenic differentiation of ASCs. Specifically, microRNAs such as miR-17, miR-23a, and miR-31 are expressed during the osteogenic differentiation of ASCs, and appear to play a role in inhibiting various steps in bone morphogenetic protein-2 (BMP2)-mediated osteogenesis. Importantly, a number of microRNAs including miR-17 and miR-31 that act to attenuate the osteogenic differentiation of ASCs are themselves stimulated by transforming growth factor β-1 (TGFβ-1). In addition, transforming growth factor β-1 is also known to suppress the expression of microRNAs involved in myogenic differentiation. These data suggest that preconditioning strategies to reduce TGFβ-1 activity in ASCs may improve the therapeutic potential of ASCs for musculoskeletal application. Moreover, these findings support the isolation of ASCs from subcutaneous fat depots that tend to have low endogenous levels of TGFβ-1 expression.

  15. Antiproliferative effect of ASC-J9 delivered by PLGA nanoparticles against estrogen-dependent breast cancer cells.

    Science.gov (United States)

    Verderio, Paolo; Pandolfi, Laura; Mazzucchelli, Serena; Marinozzi, Maria Rosaria; Vanna, Renzo; Gramatica, Furio; Corsi, Fabio; Colombo, Miriam; Morasso, Carlo; Prosperi, Davide

    2014-08-04

    Among polymeric nanoparticles designed for cancer therapy, PLGA nanoparticles have become one of the most popular polymeric devices for chemotherapeutic-based nanoformulations against several kinds of malignant diseases. Promising properties, including long circulation time, enhanced tumor localization, interference with "multidrug" resistance effects, and environmental biodegradability, often result in an improvement of the drug bioavailability and effectiveness. In the present work, we have synthesized 1,7-bis(3,4-dimethoxyphenyl)-5-hydroxyhepta-1,4,6-trien-3-one (ASC-J9) and developed uniform ASC-J9-loaded PLGA nanoparticles of about 120 nm, which were prepared by a single-emulsion process. Structural and morphological features of the nanoformulation were analyzed, followed by an accurate evaluation of the in vitro drug release kinetics, which exhibited Fickian diffusion over 10 days. The intracellular degradation of ASC-J9-bearing nanoparticles within estrogen-dependent MCF-7 breast cancer cells was correlated to a time- and dose-dependent activity of the released drug. A cellular growth inhibition associated with a specific cell cycle G2/M blocking effect caused by ASC-J9 release inside the cytosol allowed us to put forward a hypothesis on the action mechanism of this nanosystem, which led to the final cell apoptosis. Our study was accomplished using Annexin V-based cell death analysis, MTT assessment of proliferation, radical scavenging activity, and intracellular ROS evaluation. Moreover, the intracellular localization of nanoformulated ASC-J9 was confirmed by a Raman optical imaging experiment designed ad hoc. PLGA nanoparticles and ASC-J9 also proved to be safe for a healthy embryo fibroblast cell line (3T3-L1), suggesting a possible clinical translation of this potential nanochemotherapeutic to improve the inherently poor bioavailability of hydrophobic ASC-J9, which could be proposed for the treatment of malignant breast cancer.
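
    The abstract reports release kinetics consistent with Fickian diffusion over 10 days. As an illustrative sketch only (the data points below are invented, not from the paper), such behavior is often checked by fitting the early-time Korsmeyer-Peppas power law Mt/Minf = k*t^n and reading an exponent n near 0.43-0.5 (for spherical particles) as Fickian diffusion:

        # Illustrative sketch: fit a Korsmeyer-Peppas power law to hypothetical
        # cumulative-release data. The numbers are made up for demonstration.
        import numpy as np

        t = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])          # days (hypothetical)
        frac = np.array([0.08, 0.12, 0.17, 0.24, 0.29, 0.37, 0.44, 0.52])

        # Fit log(frac) = log(k) + n*log(t) on the portion with frac <= 0.6,
        # where the power law is usually considered valid.
        mask = frac <= 0.6
        n, log_k = np.polyfit(np.log(t[mask]), np.log(frac[mask]), 1)
        print(f"release exponent n = {n:.2f}, k = {np.exp(log_k):.2f}")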

  16. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analysis is attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for three-dimensional cephalometric analysis of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using the t-test. Reliability, tested using the interclass correlation coefficient, was stronger for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). Paired t-test comparison showed no statistically significant difference between the measurements made with the two packages. InVivoDental5.0 measurements are more reproducible and the software is more user-friendly than 3DCeph™. There was no statistical difference between the two packages in linear or angular measurements, but 3DCeph™ is more time-consuming for three-dimensional analysis than InVivoDental5.0.
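
    The comparison step named in the abstract is a paired t-test on the same measurement obtained with both packages. A hedged sketch of that step, on invented paired measurements rather than the study's data, would look like this:

        # Sketch of a paired t-test on one angular measurement (degrees) made for
        # the same 10 patients in two software packages. Values are invented.
        import numpy as np
        from scipy import stats

        package_a = np.array([82.1, 79.4, 85.0, 80.2, 83.3, 78.9, 81.7, 84.2, 80.8, 82.5])
        package_b = np.array([82.4, 79.1, 85.3, 80.0, 83.6, 79.2, 81.5, 84.0, 81.1, 82.8])

        t_stat, p_value = stats.ttest_rel(package_a, package_b)
        print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")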

  17. Over-expression, purification and characterization of an Asc-1 homologue from Gloeobacter violaceus

    DEFF Research Database (Denmark)

    Wang, Xiaole; Hald, Helle; Ernst, Heidi Asschenfeldt

    2010-01-01

    The human alanine-serine-cysteine transporter 1 (Asc-1) belongs to the slc7a family of solute carrier transporters. Asc-1 mediates the uptake of D-serine in an exchanger-type fashion, coupling the process to the release of alanine and cysteine. Among the bacterial Asc-1 homologues, one transporter...... of auto-induction was crucial for obtaining high yields and purity of the transporter. The transporter was purified with yields in the range of 0.2-0.4 mg per L culture and eluted in a single peak from a size-exclusion column. The circular dichroism spectrum revealed a folded and apparently all...

  18. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
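
    The core computation described above is the line integral of attenuation through the planning CT along each ray. As a minimal illustration only, and under simplifying assumptions of my own (a parallel beam aligned with one volume axis and a crude HU-to-attenuation conversion, rather than the divergent-beam ray casting and source models a real DRR tool uses), the ray-sum step can be sketched as:

        # Parallel-beam sketch of the DRR idea: convert CT numbers to linear
        # attenuation, integrate along the beam axis, and map line integrals to a
        # transmitted-intensity image.
        import numpy as np

        def simple_drr(ct_hu, voxel_mm=1.0, mu_water_per_mm=0.02, axis=0):
            """ct_hu: 3D array of Hounsfield units; returns a 2D DRR-like image."""
            mu = mu_water_per_mm * (1.0 + ct_hu / 1000.0)    # HU -> attenuation
            mu = np.clip(mu, 0.0, None)
            line_integrals = mu.sum(axis=axis) * voxel_mm    # integrate along beam
            return np.exp(-line_integrals)                   # transmitted intensity

        # Hypothetical use: a water cylinder in air
        vol = np.full((64, 64, 64), -1000.0)                 # air
        zz, yy, xx = np.mgrid[0:64, 0:64, 0:64]
        vol[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = 0.0 # water (HU = 0)
        drr = simple_drr(vol, voxel_mm=2.0)
        print(drr.shape, drr.min(), drr.max())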

  19. Computer Games as Virtual Environments for Safety-Critical Software Validation

    Directory of Open Access Journals (Sweden)

    Štefan Korečko

    2017-01-01

    Full Text Available Computer games have become an inseparable part of everyday life in modern society, and the time people spend playing them every day is increasing. This trend has prompted noticeable research activity focused on utilizing the time spent playing in a meaningful way, for example to help solve scientific problems or tasks related to computer systems development. In this paper we present one contribution to this activity: a software system consisting of a modified version of the Open Rails train simulator and an application called TS2JavaConn, which allows separately developed software controllers to be used with the simulator. The system is intended for validation of controllers developed by formal methods. The paper describes the overall architecture of the system and the operation of its components. It also compares the system with other approaches to purposeful utilization of computer games, specifies suitable formal methods, and illustrates its intended use on an example.

  20. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.
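
    The abstract mentions that the environment supports compartmental models. As a generic illustration of what a compartmental model is (this is not the Virtual Cell API; the species, compartments and rate constants below are invented), a two-compartment reaction-transport scheme can be written as ordinary differential equations and integrated:

        # Generic compartmental-model sketch: a species produced in the cytosol,
        # transported into the nucleus, and degraded there. Rates are invented.
        import numpy as np
        from scipy.integrate import solve_ivp

        k_prod, k_in, k_out, k_deg = 1.0, 0.3, 0.1, 0.2   # hypothetical rates (1/s)

        def rhs(t, y):
            c_cyt, c_nuc = y
            dc_cyt = k_prod - k_in * c_cyt + k_out * c_nuc
            dc_nuc = k_in * c_cyt - k_out * c_nuc - k_deg * c_nuc
            return [dc_cyt, dc_nuc]

        sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0], t_eval=np.linspace(0, 100, 11))
        for t, c_cyt, c_nuc in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t={t:5.1f}  cytosol={c_cyt:6.3f}  nucleus={c_nuc:6.3f}")

    Spatial models add diffusion within experimentally derived geometries; the compartmental case above keeps only the well-mixed reaction and transport terms.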

  1. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and
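
    The framework described above is built on a publish/subscribe architecture. A minimal event-bus sketch of that pattern (class, method and topic names are invented; this is not the framework's actual API) is:

        # Minimal publish/subscribe sketch illustrating the architectural pattern:
        # modules subscribe to topics and receive every message published on them.
        from collections import defaultdict
        from typing import Any, Callable

        class EventBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
                self._subscribers[topic].append(handler)

            def publish(self, topic: str, message: Any) -> None:
                for handler in self._subscribers[topic]:
                    handler(message)

        bus = EventBus()
        bus.subscribe("face.expression", lambda m: print("fusion module got:", m))
        bus.subscribe("face.expression", lambda m: print("logger got:", m))
        bus.publish("face.expression", {"label": "smile", "confidence": 0.92})

    Decoupling producers (e.g. sensor or analysis modules) from consumers in this way is what lets such a framework be reconfigured and tested module by module.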

  2. Fundamentals of civil engineering an introduction to the ASCE body of knowledge

    CERN Document Server

    McCuen, Richard H; Wong, Melanie K

    2011-01-01

    While the ASCE Body of Knowledge (BOK2) is the codified source for all technical and non-technical information necessary for those seeking to attain licensure in civil engineering, recent graduates have notoriously been lacking in the non-technical aspects even as they excel in the technical. Fundamentals of Civil Engineering: An Introduction to the ASCE Body of Knowledge addresses this shortfall and helps budding engineers develop the knowledge, skills, and attitudes suggested and implied by the BOK2. Written as a resource for all of the non-technical outcomes not specifically covered in the

  3. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN) technology is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies, and is thus applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
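
    To make the "composite network–compute service" idea concrete, here is a back-of-the-envelope sketch under my own assumptions (not the paper's model): treat the network path and the compute service as two queues in series and approximate mean end-to-end response time with M/M/1 formulas.

        # Serial composition sketch: network delivery followed by compute service.
        def mm1_response_time(arrival_rate, service_rate):
            """Mean time in an M/M/1 queue; requires utilization < 1."""
            if arrival_rate >= service_rate:
                raise ValueError("queue is unstable (utilization >= 1)")
            return 1.0 / (service_rate - arrival_rate)

        def composite_response_time(arrival_rate, net_rate, compute_rate):
            return (mm1_response_time(arrival_rate, net_rate)
                    + mm1_response_time(arrival_rate, compute_rate))

        # Hypothetical numbers: 80 requests/s offered to a 100 req/s network path
        # and a 120 req/s compute service.
        print(f"{composite_response_time(80.0, 100.0, 120.0) * 1000:.1f} ms")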

  4. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  5. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  6. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner...... approach --- allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances...... from the 2000-study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base, and the commercial GeoSteiner 4.0 code base....
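
    For orientation, here is a tiny illustration of the geometric core of the Euclidean Steiner problem: for three terminals the optimal Steiner point is the Fermat point, which a Weiszfeld-style iteration locates by minimizing total distance. GeoSteiner itself generates and concatenates full Steiner topologies for instances with thousands of terminals; this sketch covers only the 3-terminal special case and is not GeoSteiner code.

        # Fermat (Steiner) point of three terminals via Weiszfeld iteration.
        import numpy as np

        def fermat_point(terminals, n_iter=200):
            pts = np.asarray(terminals, dtype=float)
            x = pts.mean(axis=0)                      # start at the centroid
            for _ in range(n_iter):
                d = np.linalg.norm(pts - x, axis=1)
                if np.any(d < 1e-12):                 # coincides with a terminal
                    return x
                w = 1.0 / d
                x = (pts * w[:, None]).sum(axis=0) / w.sum()
            return x

        terminals = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
        s = fermat_point(terminals)
        total = sum(np.linalg.norm(np.array(t) - s) for t in terminals)
        print("Steiner point:", s.round(3), "total length:", round(total, 3))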

  7. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  8. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  9. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Because the particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates, which arise from the gamma radiation emitted by a radioactive source, are the main TRACO-1 input variables. Although the results obtained so far are preliminary, their analysis suggests that tracking a radioactive source using TRACO-1 can be successful; a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
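
    The reconstruction idea behind radioactive particle tracking is to infer the tracer position from the counting rates seen at several detectors. The sketch below is heavily hedged: it assumes an idealized inverse-square count-versus-distance model and invented detector positions and source strength, whereas real systems (and presumably TRACO-1) rely on empirically calibrated count maps.

        # Estimate a particle position from detector counting rates by least squares.
        import numpy as np
        from scipy.optimize import least_squares

        detectors = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
        S = 5000.0                                    # assumed source strength factor

        def expected_counts(pos):
            d2 = ((detectors - pos) ** 2).sum(axis=1)
            return S / d2                             # idealized inverse-square model

        true_pos = np.array([0.3, 0.4, 0.2])
        rng = np.random.default_rng(1)
        measured = rng.poisson(expected_counts(true_pos)).astype(float)

        fit = least_squares(lambda p: expected_counts(p) - measured,
                            x0=np.array([0.5, 0.5, 0.5]))
        print("estimated position:", fit.x.round(3))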

  10. A community Q&A for HEP Software and Computing ?

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    How often do you use StackOverflow or ServerFault to find information in your daily work? Would you be interested in a community Q&A site for HEP Software and Computing, for instance a dedicated StackExchange site? I looked into this question...

  11. Blind trials of computer-assisted structure elucidation software

    Directory of Open Access Journals (Sweden)

    Moser Arvin

    2012-02-01

    Full Text Available Abstract Background One of the largest challenges in chemistry today remains that of efficiently mining through vast amounts of data in order to elucidate the chemical structure of an unknown compound. The elucidated candidate compound must be fully consistent with the data, and any competing candidates must be efficiently eliminated without doubt, using additional data if necessary. It has become increasingly necessary to incorporate an in silico structure generation and verification tool to facilitate this elucidation process. An effective structure elucidation software technology aims to mimic the skills of a human in interpreting the complex nature of spectral data while producing a solution within a reasonable amount of time. This type of software is known as computer-assisted structure elucidation or CASE software. A systematic trial of the ACD/Structure Elucidator CASE software was conducted over an extended period of time by analysing a set of single- and double-blind trials submitted by a global audience of scientists. The purpose of the blind trials was to reduce subjective bias. Double-blind trials comprised data in which the candidate compound was unknown to both the submitting scientist and the analyst. The level of expertise of the submitting scientists ranged from novice to expert structure elucidation specialists with experience in pharmaceutical, industrial, government and academic environments. Results Beginning in 2003, and for the following nine years, the algorithms and software technology contained within ACD/Structure Elucidator have been tested against 112 data sets; many of these were unique challenges. Of these challenges, 9% were double-blind trials. The results of eighteen of the single-blind trials were investigated in detail and included problems of a diverse nature with many of the specific challenges associated with algorithmic structure elucidation such as deficiency in protons, structure symmetry, a large number of

  12. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of the behavior of such systems is carried out by means of computational models based on partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (around 90% [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243 p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (in press).
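
    To illustrate the general non-overlapping domain-decomposition idea referred to above (this is a textbook-style sketch, not the DVS algorithm), a 1D Poisson problem -u'' = f with u(0)=u(1)=0 can be split into two subdomains that share one interface node; each subdomain is solved independently and only a small Schur-complement system couples them:

        # Non-overlapping DDM sketch: two independent subdomain solves plus a tiny
        # interface (Schur complement) system, checked against a direct solve.
        import numpy as np

        n = 9                                   # interior grid points, h = 1/(n+1)
        h = 1.0 / (n + 1)
        A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        b = np.ones(n)                          # right-hand side f(x) = 1

        mid = n // 2                            # interface node index
        sub1 = list(range(0, mid))              # interior of subdomain 1
        sub2 = list(range(mid + 1, n))          # interior of subdomain 2
        gamma = [mid]

        def block(rows, cols):
            return A[np.ix_(rows, cols)]

        # Local solves, one per subdomain (could run on different processors)
        A11, A22 = block(sub1, sub1), block(sub2, sub2)
        x1 = np.linalg.solve(A11, b[sub1])
        x2 = np.linalg.solve(A22, b[sub2])
        y1 = np.linalg.solve(A11, block(sub1, gamma))
        y2 = np.linalg.solve(A22, block(sub2, gamma))

        # Small interface system and back-substitution into the subdomains
        S = block(gamma, gamma) - block(gamma, sub1) @ y1 - block(gamma, sub2) @ y2
        g = b[gamma] - block(gamma, sub1) @ x1 - block(gamma, sub2) @ x2
        u_gamma = np.linalg.solve(S, g)

        u = np.empty(n)
        u[gamma] = u_gamma
        u[sub1] = x1 - y1 @ u_gamma
        u[sub2] = x2 - y2 @ u_gamma
        print("max error vs direct solve:", np.abs(u - np.linalg.solve(A, b)).max())

    The point of the pattern is that the subdomain factorizations are independent (hence parallel) and only the low-dimensional interface problem requires global communication.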

  13. Protecting software agents from malicious hosts using quantum computing

    Science.gov (United States)

    Reisner, John; Donkor, Eric

    2000-07-01

    We evaluate how quantum computing can be applied to security problems for software agents. Agent-based computing, which merges technological advances in artificial intelligence and mobile computing, is a rapidly growing domain, especially in applications such as electronic commerce, network management, information retrieval, and mission planning. System security is one of the more eminent research areas in agent-based computing, and the specific problem of protecting a mobile agent from a potentially hostile host is one of the most difficult of these challenges. In this work, we describe our agent model, and discuss the capabilities and limitations of classical solutions to the malicious host problem. Quantum computing may be extremely helpful in addressing the limitations of classical solutions to this problem. This paper highlights some of the areas where quantum computing could be applied to agent security.

  14. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  15. Software Safety Risk in Legacy Safety-Critical Computer Systems

    Science.gov (United States)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system. NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest; NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: a) a documented demonstration that a system complies with the specified safety requirements; b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  16. ASC Addresses Unit Commanders' Concerns through LBE and Reset Programs

    National Research Council Canada - National Science Library

    Young, Mark E

    2008-01-01

    .... Army Sustainment Command (ASC), part of the U.S. Army Materiel Command (AMC) team, is available to assist, identify, and resolve equipment and maintenance problems as well as materiel readiness issues for combatant commanders...

  17. Learning Vocabulary in a Foreign Language: A Computer Software Based Model Attempt

    Science.gov (United States)

    Yelbay Yilmaz, Yasemin

    2015-01-01

    This study aimed at devising a vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundation linguistics and learning theories have been adapted to the foreign language vocabulary learning context using a computer software named Parole that was designed exclusively for this study. Experimental…

  18. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  19. The American Satellite Company (ASC) satellite deployed from payload bay

    Science.gov (United States)

    1985-01-01

    The American Satellite Company (ASC) communications satellite is deployed from the payload bay of the Shuttle Discovery. A portion of the cloudy surface of the earth can be seen to the left of the frame.

  20. Software for the ACP [Advanced Computer Program] multiprocessor system

    International Nuclear Information System (INIS)

    Biel, J.; Areti, H.; Atac, R.

    1987-01-01

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system

  1. The Influence of Personal Characteristics, Interaction: (Computer/Individual), Computer Self-efficacy, Personal Innovativeness in Information Technology to Computer Anxiety in use of Mind your Own Business Accounting Software

    OpenAIRE

    Mayasari, Mega; ., Gudono

    2015-01-01

    The purpose of this study was to identify the factors that cause computer anxiety in the use of Mind Your Own Business (MYOB) accounting software, i.e., to assess whether age, gender, amount of training, ownership (regular use of accounting software), computer self-efficacy, and personal innovativeness in Information Technology (IT) influence computer anxiety. The study also examined whether there is a relationship between trait anxiety and negative affect and computer self-eff...

  2. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  3. FY17 ASC CSSE L2 Milestone 6018: Power Usage Characteristics of Workloads Running on Trinity.

    Energy Technology Data Exchange (ETDEWEB)

    Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    The overall goal of this work was to utilize the Advanced Power Management (APM) capabilities of the ATS-1 Trinity platform to understand the power usage behavior of ASC workloads running on Trinity and gain insight into the potential for utilizing power management techniques on future ASC platforms.

  4. Intracellular invasion of Orientia tsutsugamushi activates inflammasome in asc-dependent manner.

    Directory of Open Access Journals (Sweden)

    Jung-Eun Koo

    Full Text Available Orientia tsutsugamushi, a causative agent of scrub typhus, is an obligate intracellular bacterium, which escapes from the endo/phagosome and replicates in the host cytoplasm. O. tsutsugamushi infection induces production of pro-inflammatory mediators including interleukin-1β (IL-1β), which is secreted mainly from macrophages upon cytosolic stimuli by activating cysteine protease caspase-1 within a complex called the inflammasome, and is a key player in initiating and maintaining the inflammatory response. However, the mechanism for IL-1β maturation upon O. tsutsugamushi infection has not been identified. In this study, we show that IL-1 receptor signaling is required for efficient host protection from O. tsutsugamushi infection. Live Orientia, but not heat- or UV-inactivated Orientia, activates the inflammasome through active bacterial uptake and endo/phagosomal maturation. Furthermore, Orientia-stimulated secretion of IL-1β and activation of caspase-1 are ASC- and caspase-1-dependent since IL-1β production was impaired in Asc- and caspase-1-deficient macrophages but not in Nlrp3-, Nlrc4- and Aim2-deficient macrophages. Therefore, live O. tsutsugamushi triggers ASC inflammasome activation leading to IL-1β production, which is a critical innate immune response for effective host defense.

  5. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  6. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  7. A directory of computer software applications: astronomy and astrophysics, 1970-May, 1979

    International Nuclear Information System (INIS)

    1979-05-01

    Astronomy and astrophysics reports that list computer programs and/or their documentation are cited. These software applications pertain to topics such as solar activity, atmospheric radiative transfer, stellar and galactic structure, lunar and planetary studies, and astrophysical data reduction. The directory contains complete bibliographic data for each report as well as a subject and a corporate author index. The computer software offered by NTIS was created by a variety of Federal agencies to meet their diverse but quite specific objectives. It is provided without installation, support, or maintenance services and sometimes requires customer modifications to run effectively in customer environments

  8. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
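
    The cluster approach described above amounts to splitting one long Monte Carlo run into independent jobs with distinct random seeds and merging their outputs afterwards. The sketch below is generic and hedged: it is not GATE's macro language or its merger tool, just an illustration of the split-run-merge pattern with a stand-in "simulation".

        # Generic split-run-merge sketch: divide a target number of events across
        # jobs with distinct seeds, run them in parallel, and sum the histograms.
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        N_TOTAL = 1_000_000          # total simulated events (hypothetical)
        N_JOBS = 8

        def run_job(args):
            seed, n_events = args
            rng = np.random.default_rng(seed)
            # Stand-in for a real simulation: histogram of detected "energies"
            energies = rng.normal(loc=140.0, scale=15.0, size=n_events)
            hist, _ = np.histogram(energies, bins=64, range=(0.0, 300.0))
            return hist

        def split(total, jobs):
            base, extra = divmod(total, jobs)
            return [(seed, base + (1 if seed < extra else 0)) for seed in range(jobs)]

        if __name__ == "__main__":
            with ProcessPoolExecutor(max_workers=N_JOBS) as pool:
                merged = sum(pool.map(run_job, split(N_TOTAL, N_JOBS)))
            print("total entries in merged histogram:", int(merged.sum()))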

  9. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine was established in 1993 with the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such software applications are usually neither free nor open-source, which hinders their adaptation to the most diverse interests. The aim was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 evaluations divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography appliances, assessing coronary lesions and plaques in the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple agreement and kappa statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions; agreement for lesions of 70% or greater in the ADA was lower, but this is also observed when the anatomical reference standard is used.
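
    For readers unfamiliar with the agreement statistics mentioned above, the sketch below shows one generic way to compute simple (per-case) agreement and Cohen's kappa for two observers' lesion classifications; the category labels are invented for illustration, and this is not the analysis code used in the study.

```python
# Generic sketch: simple agreement and Cohen's kappa for two observers.
import numpy as np

def simple_agreement(a, b):
    a, b = np.asarray(a), np.asarray(b)
    return np.mean(a == b)

def cohens_kappa(a, b):
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    p_o = np.mean(a == b)                      # observed agreement
    # Expected agreement from the marginal category frequencies
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return (p_o - p_e) / (1.0 - p_e)

obs1 = ["no lesion", "<50%", ">=70%", "<50%"]   # hypothetical readings
obs2 = ["no lesion", "<50%", "<50%", "<50%"]
print(simple_agreement(obs1, obs2), cohens_kappa(obs1, obs2))
```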

  10. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    Science.gov (United States)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of the critical facilities, infrastructure, and key resources necessary for immediate response and for economic and social recovery. Critical facilities include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6, Tsunami Loads and Effects, for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will initially be applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and for states in the Pacific Northwest governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis; engineering design must consider the occurrence of events greater than those in the historical record.

  11. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  12. A software for computer automated radioactive particle tracking

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Wilson S.; Brandao, Luis E. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: wilson@ien.gov.br; brandao@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)]. E-mail: delson@smb.lin.ufrj.br

    2008-07-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Since this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates, which are the main TRACO-1 input variables, were related to the emission of gamma radiation from a radioactive source. Although the results found so far are preliminary, their analysis suggests that tracking a radioactive source with TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)

  13. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. Given the continuing difficulties in quantifying the results of complex computations, it is increasingly important to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues, are explored with numerous examples. One aim of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself this has not been so damaging, in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  14. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  15. The role of colposcopy and typization of human papillomavirus in further diagnostic proceedings in patients with ASC-US cytological finding of the uterine cervix

    Directory of Open Access Journals (Sweden)

    Živadinović Radomir

    2009-01-01

    Full Text Available Background/Aim. The 2001 Bethesda system of classification of cytological findings introduced two subcategories within the category of atypical squamous cells (ASC): ASC of undetermined significance (ASC-US) and ASC which cannot exclude high-grade intraepithelial lesions (ASC-H). The aim of our study was to assess a possible association of these two subcategories with pathologic biopsy findings and to determine the best further diagnostic procedure. Methods. At the Clinic of Gynecology and Obstetrics, Niš, 130 patients with ASC findings were analyzed. Colposcopy was performed in all study participants. Patients with pathological colposcopic findings underwent cervical biopsy. In 10 patients with pathologic histologic findings and 15 with benign findings, human papillomavirus (HPV) typization was done using the Hybrid Capture method. Results. Patients with an ASC-H finding had significantly more pathologic biopsies than patients with an ASC-US finding (57.84% vs. 20.72%). Conclusion. Colposcopy exhibited somewhat higher sensitivity than HPV typization (94.7 vs. 90), but lower specificity (79.27 vs. 86.6). The use of HPV typization in the triage of patients with an ASC cytologic smear yields a statistically significant reduction in the percentage of unnecessary cervical biopsies.

  16. Antidepressant-Like Effects of Central BDNF Administration in Mice of Antidepressant Sensitive Catalepsy (ASC) Strain.

    Science.gov (United States)

    Tikhonova, Maria; Kulikov, Alexander V

    2012-08-31

    Although numerous data implicate brain-derived neurotrophic factor (BDNF) in the pathophysiology of depression, the potential for BDNF to correct genetically defined depressive-like states is poorly studied. This study aimed to reveal antidepressant-like effects of BDNF (300 ng, 2×, i.c.v.) on behavior and on mRNA expression of genes associated with a depression-like state in the brain of mice of the antidepressant sensitive catalepsy (ASC) strain, which is characterized by a high hereditary predisposition to catalepsy and depressive-like features. Behavioral tests were conducted on the 7th-16th days after the first (4th-13th after the second) BDNF injection. BDNF normalized impaired sexual motivation in ASC males, and this effect differed advantageously from that of widely used antidepressants. The anticataleptic effect of two BDNF injections was enhanced compared with a single administration. A tendency toward decreased immobility duration in the tail-suspension test was observed in BDNF-treated ASC mice. The effects on catalepsy and sexual motivation were specific, since BDNF did not alter locomotor and exploratory activity or social interest in the ASC mice. Along with behavioral antidepressant-like effects in the ASC mice, BDNF increased hippocampal mRNA levels of Bdnf and Creb1 (cAMP response element-binding protein gene). BDNF also augmented mRNA levels of the Arc gene encoding Arc (Activity-regulated cytoskeleton-associated) protein, which is involved in BDNF-induced processes of neuronal and synaptic plasticity, in the hippocampus and prefrontal cortex. The data suggest that: [1] BDNF is effective in the treatment of some genetically defined behavioral disturbances; [2] BDNF influences sexually-motivated behavior; [3] Arc mRNA levels may serve as a molecular marker of BDNF physiological activity associated with its long-lasting behavioral effects; [4] the ASC mouse strain can be used as a suitable model to study mechanisms of BDNF effects on

  17. Mahotas: Open source software for scriptable computer vision

    Directory of Open Access Journals (Sweden)

    Luis Pedro Coelho

    2013-07-01

    Full Text Available Mahotas is a computer vision library for Python. It contains traditional image processing functionality such as filtering and morphological operations as well as more modern computer vision functions for feature computation, including interest point detection and local descriptors. The interface is in Python, a dynamic programming language, which is appropriate for fast development, but the algorithms are implemented in C++ and are tuned for speed. The library is designed to fit in with the scientific software ecosystem in this language and can leverage the existing infrastructure developed in that language. Mahotas is released under a liberal open source license (MIT License and is available from http://github.com/luispedro/mahotas and from the Python Package Index (http://pypi.python.org/pypi/mahotas. Tutorials and full API documentation are available online at http://mahotas.readthedocs.org/.
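
    As a brief illustration of the scriptable interface described above, the snippet below strings together a few core Mahotas calls (Gaussian filtering, Otsu thresholding, labelling, and Haralick texture features). Function names follow the published API, but exact availability and backends (e.g. image readers) vary by version, so treat this as a sketch rather than canonical usage; the input file name is hypothetical.

```python
# Sketch of typical Mahotas usage: filter, threshold, label, texture features.
import mahotas as mh
import numpy as np

image = mh.imread('cells.png')                    # hypothetical input image
if image.ndim == 3:
    image = image.mean(axis=2).astype(np.uint8)   # collapse to greyscale

smoothed = mh.gaussian_filter(image, 2.0)         # C++-backed Gaussian filter
threshold = mh.thresholding.otsu(smoothed.astype(np.uint8))
binary = smoothed > threshold
labelled, n_objects = mh.label(binary)            # connected-component labelling
texture = mh.features.haralick(image).mean(axis=0)  # global texture descriptor
print(n_objects, texture.shape)
```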

  18. ASC Trilab L2 Codesign Milestone 2015

    Energy Technology Data Exchange (ETDEWEB)

    Trott, Christian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hammond, Simon David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinge, Dennis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Paul T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vaughan, Courtenay T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cook, Jeanine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Edwards, Harold C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rajan, Mahesh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoekstra, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    For the FY15 ASC L2 Trilab Codesign milestone, Sandia National Laboratories performed two main studies. The first study investigated three topics (performance, cross-platform portability and programmer productivity) when using OpenMP directives and the RAJA and Kokkos programming models, available from LLNL and SNL respectively. The focus of this first study was the LULESH mini-application developed and maintained by LLNL. In the coming sections of the report the reader will find performance comparisons (and a demonstration of portability) for a variety of mini-application implementations produced during this study with varying levels of optimization. Of note is that the implementations include optimizations across a number of programming models, to help ensure that claims that Kokkos can provide native-class application performance are valid. The second study performed during FY15 is a performance assessment of the MiniAero mini-application developed by Sandia. This mini-application was developed by the SIERRA Thermal-Fluid team at Sandia for the purpose of learning the Kokkos programming model, and so is available in only a single implementation. For this report we studied its performance and scaling on a number of machines, with the intent of providing insight into potential performance issues that may be experienced when similar algorithms are deployed on the forthcoming Trinity ASC ATS platform.

  19. Software development on the DIII-D control and data acquisition computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B. Jr.; Piglowski, D.

    1997-11-01

    The various software systems developed for the DIII-D tokamak have played a highly visible and important role in tokamak operations and fusion research. Because of the heavy reliance on in-house developed software encompassing all aspects of operating the tokamak, much attention has been given to the careful design, development and maintenance of these software systems. Software systems responsible for tokamak control and monitoring, neutral beam injection, and data acquisition demand the highest level of reliability during plasma operations. These systems, made up of hundreds of programs totaling thousands of lines of code, have presented a wide variety of software design and development issues, ranging from low-level hardware communications, database management, and distributed process control to man-machine interfaces. The focus of this paper is to describe how software is developed and managed for the DIII-D control and data acquisition computers. It includes an overview and status of software systems implemented for tokamak control, neutral beam control, and data acquisition. The issues and challenges faced in developing and managing the large amounts of software in support of the dynamic and ever-changing needs of the DIII-D experimental program are addressed.

  20. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
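
    The idea of dynamic weight factors, i.e. renormalizing the weights over whichever parameters are actually present in a sample, can be sketched as follows. The parameter names, weights, and sub-index scores below are invented for illustration and are not taken from IWQIS.

```python
# Sketch of a weighted water quality index with dynamic (renormalized) weights.
# Parameter names, weights and sub-index scores are purely illustrative.
BASE_WEIGHTS = {"pH": 0.12, "turbidity": 0.10, "nitrate": 0.18,
                "fluoride": 0.15, "total_coliform": 0.25, "TDS": 0.20}

def wqi(sub_indices):
    """sub_indices: dict of parameter -> quality sub-index (0-100) or None.
    Missing parameters simply drop out; the remaining weights are renormalized."""
    available = {p: q for p, q in sub_indices.items() if q is not None}
    total_w = sum(BASE_WEIGHTS[p] for p in available)
    return sum(BASE_WEIGHTS[p] * q for p, q in available.items()) / total_w

sample = {"pH": 88.0, "turbidity": 95.0, "nitrate": None,   # nitrate missing
          "fluoride": 72.0, "total_coliform": 60.0, "TDS": 81.0}
print(round(wqi(sample), 1))
```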

  1. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with ecological research, and with the construction and application of theories and methods of the computational sciences, including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic processes, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  2. NLRP3 and ASC suppress lupus-like autoimmunity by driving the immunosuppressive effects of TGF-β receptor signalling.

    Science.gov (United States)

    Lech, Maciej; Lorenz, Georg; Kulkarni, Onkar P; Grosser, Marian O O; Stigrot, Nora; Darisipudi, Murthy N; Günthner, Roman; Wintergerst, Maximilian W M; Anz, David; Susanti, Heni Eka; Anders, Hans-Joachim

    2015-12-01

    The NLRP3/ASC inflammasome drives host defence and autoinflammatory disorders by activating caspase-1 to trigger the secretion of mature interleukin (IL)-1β/IL-18, but its potential role in autoimmunity is speculative. We generated and phenotyped Nlrp3-deficient, Asc-deficient, Il-1r-deficient and Il-18-deficient C57BL/6-lpr/lpr mice, the latter being a mild model of spontaneous lupus-like autoimmunity. While lack of IL-1R or IL-18 did not affect the C57BL/6-lpr/lpr phenotype, lack of NLRP3 or ASC triggered massive lymphoproliferation, lung T cell infiltrates and severe proliferative lupus nephritis within 6 months, which were all absent in age-matched C57BL/6-lpr/lpr controls. Lack of NLRP3 or ASC increased dendritic cell and macrophage activation, the expression of numerous proinflammatory mediators, lymphocyte necrosis and the expansion of most T cell and B cell subsets. In contrast, plasma cells and autoantibody production were hardly affected. This unexpected immunosuppressive effect of NLRP3 and ASC may relate to their known role in SMAD2/3 phosphorylation during transforming growth factor (TGF)-β receptor signalling; for example, Nlrp3-deficiency and Asc-deficiency significantly suppressed the expression of numerous TGF-β target genes in C57BL/6-lpr/lpr mice and partially recapitulated the known autoimmune phenotype of Tgf-β1-deficient mice. These data identify a novel non-canonical immunoregulatory function of NLRP3 and ASC in autoimmunity.

  3. Primary Health Care Software-A Computer Based Data Management System

    Directory of Open Access Journals (Sweden)

    Tuli K

    1990-01-01

    Full Text Available The duplication and time consumption inherent in the usual manual system of data collection necessitated experimentation with a computer-based management system for primary health care in primary health centres. The population details available in the existing manual system were used for computerizing the data. Software was designed for data entry and analysis and was written in dBase III Plus. It was designed so that a person with no knowledge of computers could use it. A cost analysis was done, and the computer system was found to be more cost-effective than the usual manual system.

  4. Development of the JFT-2M data analysis software system on the mainframe computer

    International Nuclear Information System (INIS)

    Matsuda, Toshiaki; Amagai, Akira; Suda, Shuji; Maemura, Katsumi; Hata, Ken-ichiro.

    1990-11-01

    We developed a software system on the FACOM mainframe computer to analyze JFT-2M experimental data archived by the JFT-2M data acquisition system. This reduces and distributes the CPU load of the data acquisition system and allows JFT-2M experimental data to be analyzed on the mainframe with computationally demanding codes operating on raw data, such as equilibrium calculation and transport analysis, as well as with useful software packages such as the SAS statistics package. (author)

  5. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher who is generally satisfied with abstract but accurate displays for analysis purposes and the decision maker who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease in transferring and to facilitate the linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  6. Computer-Aided Design in Power Engineering: Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems ...

  7. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  8. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  9. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  10. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  11. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  12. Designing of a Computer Software for Detection of Approximal Caries in Posterior Teeth

    International Nuclear Information System (INIS)

    Valizadeh, Solmaz; Goodini, Mostafa; Ehsani, Sara; Mohseni, Hadis; Azimi, Fateme; Bakhshandeh, Hooman

    2015-01-01

    Radiographs, adjunct to clinical examination are always valuable complementary methods for dental caries detection. Recently, progressing in digital imaging system provides possibility of software designing for automatically dental caries detection. The aim of this study was to develop and assess the function of diagnostic computer software designed for evaluation of approximal caries in posterior teeth. This software should be able to indicate the depth and location of caries on digital radiographic images. Digital radiographs were obtained of 93 teeth including 183 proximal surfaces. These images were used as a database for designing the software and training the software designer. In the design phase, considering the summed density of pixels in rows and columns of the images, the teeth were separated from each other and the unnecessary regions; for example, the root area in the alveolar bone was eliminated. Therefore, based on summed intensities, each image was segmented such that each segment contained only one tooth. Subsequently, based on the fuzzy logic, a well-known data-clustering algorithm named fuzzy c-means (FCM) was applied to the images to cluster or segment each tooth. This algorithm is referred to as a soft clustering method, which assigns data elements to one or more clusters with a specific membership function. Using the extracted clusters, the tooth border was determined and assessed for cavity. The results of histological analysis were used as the gold standard for comparison with the results obtained from the software. Depth of caries was measured, and finally Intraclass Correlation Coefficient (ICC) and Bland-Altman plot were used to show the agreement between the methods. The software diagnosed 60% of enamel caries. The ICC (for detection of enamel caries) between the computer software and histological analysis results was determined as 0.609 (95% confidence interval [CI] = 0.159-0.849) (P = 0.006). Also, the computer program diagnosed 97% of
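
    Since the caries-detection pipeline above hinges on the fuzzy c-means (FCM) clustering step, a bare-bones FCM implementation is sketched below for intensity-based segmentation. It follows the standard textbook update equations and is only an illustration; it is not the software described in the paper, and the toy intensity data are invented.

```python
# Minimal fuzzy c-means (FCM) sketch for 1-D intensity clustering.
import numpy as np

def fcm(data, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    data = np.asarray(data, dtype=float).reshape(-1, 1)      # (N, 1)
    rng = np.random.default_rng(seed)
    u = rng.random((len(data), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                        # memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ data) / um.sum(axis=0)[:, None]    # weighted cluster centers
        dist = np.abs(data - centers.T) + 1e-12              # (N, C) distances
        inv = dist ** (-2.0 / (m - 1.0))
        u_new = inv / inv.sum(axis=1, keepdims=True)         # standard FCM membership update
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centers.ravel(), u

# Toy example: pixel intensities from "sound enamel" vs. "carious" regions.
pixels = np.concatenate([np.random.normal(180, 10, 500),
                         np.random.normal(90, 15, 300)])
centers, memberships = fcm(pixels, n_clusters=2)
print(np.sort(centers))
```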

  13. Special software for computing the special functions of wave catastrophes

    Directory of Open Access Journals (Sweden)

    Andrey S. Kryukovsky

    2015-01-01

    Full Text Available The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for calculating special functions, along with questions of portability, extensibility and interoperability.
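
    The ODE approach mentioned above can be illustrated with the simplest special function of wave catastrophes, the Airy function of the fold catastrophe, which satisfies y'' = x·y. The sketch below integrates that equation numerically from its known values at x = 0 and compares against SciPy's reference implementation; it only illustrates the general method, not the authors' software.

```python
# Sketch: computing the Airy function Ai(x) (fold catastrophe) by ODE integration.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import airy, gamma

def airy_ode(x, y):
    # y[0] = Ai(x), y[1] = Ai'(x); Airy equation: Ai'' = x * Ai
    return [y[1], x * y[0]]

# Exact initial values at x = 0
ai0 = 1.0 / (3.0 ** (2.0 / 3.0) * gamma(2.0 / 3.0))
aip0 = -1.0 / (3.0 ** (1.0 / 3.0) * gamma(1.0 / 3.0))

xs = np.linspace(0.0, 4.0, 41)
sol = solve_ivp(airy_ode, (0.0, 4.0), [ai0, aip0],
                t_eval=xs, rtol=1e-10, atol=1e-12)

reference = airy(xs)[0]   # SciPy's Ai(x) for comparison
# Absolute error stays small for moderate x; tracking the decaying solution
# becomes progressively harder at large x, where the growing solution dominates.
print(np.max(np.abs(sol.y[0] - reference)))
```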

  14. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
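
    To make the parallel contingency analysis idea concrete, the sketch below distributes single-line-outage cases across worker processes with Python's multiprocessing module. The toy `run_contingency` "solver" and the branch list are placeholders invented for illustration; they are not the software or solver described in the paper.

```python
# Sketch: embarrassingly parallel N-1 contingency screening with multiprocessing.
# The "power flow" here is a placeholder computation, not a real solver.
from multiprocessing import Pool
import numpy as np

BRANCHES = [f"line_{i}" for i in range(200)]     # hypothetical branch list

def run_contingency(outaged_branch):
    # Placeholder for re-solving the network with one branch removed
    # (a real tool would run a power-flow or state-estimation solve here).
    rng = np.random.default_rng(abs(hash(outaged_branch)) % (2 ** 32))
    post_flow = rng.random(len(BRANCHES))
    overload = float(post_flow.max())
    return outaged_branch, overload

if __name__ == "__main__":
    with Pool(processes=8) as pool:             # one case per worker, in parallel
        results = pool.map(run_contingency, BRANCHES)
    worst = sorted(results, key=lambda r: r[1], reverse=True)[:5]
    print("Most severe contingencies:", worst)
```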

  15. Aeroacoustics research in Europe : the CEAS-ASC report on 2007 highlights

    NARCIS (Netherlands)

    Brouwer, H.H.; Rienstra, S.W.

    2008-01-01

    The Council of European Aerospace Societies (CEAS) Aeroacoustics Specialists Committee (ASC) supports and promotes the interests of the scientific and industrial aeroacoustics community on a European scale and European aeronautics activities internationally. In this context, "aeroacoustics"

  16. Computer-aided software development

    International Nuclear Information System (INIS)

    Teichroew, D.; Hershey, E.A. III; Yamamoto, Y.

    1978-01-01

    In recent years, as the hardware cost/capability ratio has continued to decrease and as much of the routine data processing has been computerized, the emphasis in software development has shifted from just getting systems operational to the maintenance of existing systems, reduction of duplication by integration, selective addition of new applications, systems that are more usable, maintainable, portable and reliable and to improving the productivity of software developers. This paper examines a number of trends that are changing the methods by which software is being produced and used. (Auth.)

  17. Efficient multi-objective calibration of a computationally intensive hydrologic model with parallel computing software in Python

    Science.gov (United States)

    With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...

  18. SU-F-I-43: A Software-Based Statistical Method to Compute Low Contrast Detectability in Computed Tomography Images

    Energy Technology Data Exchange (ETDEWEB)

    Chacko, M; Aldoohan, S [University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: The low contrast detectability (LCD) of a CT scanner is its ability to detect and display faint lesions. The current approach to quantifying LCD uses vendor-specific methods and phantoms, typically by subjectively observing the smallest object visible at a contrast level above the phantom background. However, this approach does not yield clinically applicable values for LCD. The current study proposes a statistical LCD metric using software tools not only to assess scanner performance, but also to quantify the key factors affecting LCD. This approach was developed using uniform QC phantoms, and its applicability was then extended under simulated clinical conditions. Methods: MATLAB software was developed to compute LCD using a uniform image of a QC phantom. For a given virtual object size, the software randomly samples the image within a selected area and uses statistical analysis based on Student's t-distribution to compute the LCD as the minimal Hounsfield units that can be distinguished from the background at the 95% confidence level. Its validity was assessed by comparison with the behavior of a known QC phantom under various scan protocols and with a tissue-mimicking phantom. The contributions of beam quality and scattered radiation to the computed LCD were quantified by using various external beam-hardening filters and phantom lengths. Results: As expected, the LCD was inversely related to object size under all scan conditions. The type of image reconstruction kernel filter and the tissue/organ type strongly influenced the background noise characteristics and therefore the computed LCD for the associated image. Conclusion: The proposed metric and its associated software tools are vendor-independent and can be used to analyze the LCD performance of any scanner. Furthermore, the method employed can be used in conjunction with the relationships established in this study between LCD and tissue type to extend these concepts to patients' clinical CT
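
    One plausible reading of the statistical LCD method described above is sketched below: sample many randomly placed circular ROIs of the chosen virtual object size from a uniform phantom image, then take the LCD as the Student's-t-based 95% bound on the spread of the ROI means. This is an assumption-laden illustration on synthetic data; details of the actual implementation may differ.

```python
# Sketch: statistical low-contrast detectability (LCD) from a uniform CT image.
import numpy as np
from scipy import stats

def roi_means(image, roi_radius_px, n_rois=200, seed=0):
    """Mean HU of randomly placed circular ROIs of a given radius."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    means = []
    for _ in range(n_rois):
        cy = rng.integers(roi_radius_px, h - roi_radius_px)
        cx = rng.integers(roi_radius_px, w - roi_radius_px)
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= roi_radius_px ** 2
        means.append(image[mask].mean())
    return np.asarray(means)

def lcd_95(image, roi_radius_px):
    m = roi_means(image, roi_radius_px)
    # Smallest HU difference distinguishable from background at 95% confidence
    t_crit = stats.t.ppf(0.975, df=len(m) - 1)
    return t_crit * m.std(ddof=1)

uniform_phantom = np.random.normal(0.0, 8.0, (512, 512))  # synthetic noise-only image
print(lcd_95(uniform_phantom, roi_radius_px=5))
```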

  19. Analysis of chromium-51 release assay data using personal computer spreadsheet software

    International Nuclear Information System (INIS)

    Lefor, A.T.; Steinberg, S.M.; Wiebke, E.A.

    1988-01-01

    The Chromium-51 release assay is a widely used technique to assess the lysis of labeled target cells in vitro. We have developed a simple technique to analyze data from Chromium-51 release assays using the widely available LOTUS 1-2-3 spreadsheet software. This package calculates percentage specific cytotoxicity and lytic units by linear regression. It uses all data points to compute the linear regression and can determine if there is a statistically significant difference between two lysis curves. The system is simple to use and easily modified, since its implementation requires neither knowledge of computer programming nor custom designed software. This package can help save considerable time when analyzing data from Chromium-51 release assays
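
    The two core calculations mentioned above, percentage specific cytotoxicity and a linear regression of lysis against effector-to-target ratio, are straightforward to reproduce outside a spreadsheet. The sketch below shows one generic way to do it with illustrative counts; it is not the LOTUS 1-2-3 template itself, and the numbers are invented.

```python
# Sketch: percent specific lysis and a linear regression of lysis vs. log(E:T ratio).
import numpy as np
from scipy import stats

def percent_specific_lysis(experimental, spontaneous, maximum):
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

# Illustrative counts-per-minute values at several effector:target ratios
et_ratios = np.array([50.0, 25.0, 12.5, 6.25])
experimental_cpm = np.array([4200.0, 3300.0, 2500.0, 1900.0])
spontaneous_cpm, maximum_cpm = 900.0, 5600.0

lysis = percent_specific_lysis(experimental_cpm, spontaneous_cpm, maximum_cpm)

# Linear regression of lysis against log(E:T), as a basis for lytic-unit estimates
slope, intercept, r, p, se = stats.linregress(np.log10(et_ratios), lysis)
print(lysis.round(1), slope, intercept, r ** 2)
```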

  20. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with MATLAB software, the COCGT (Cluster for Optimizing Computing in Gamma ray Transmission methods), is implemented. The implementation corresponds to the creation of a local network of computers, facilities and software configurations, as well as cluster tests to determine and optimize performance in data processing. The COCGT implementation was required for the computation of data from gamma transmission measurements applied to fluid dynamics and tomography reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulation data. As an initial test, the determination of the SVD (Singular Value Decomposition) of a random matrix with dimension (n, n), n=1000, using a modified Girko's law, revealed that COCGT was faster than the cluster reported in the literature [1], which is similar and operates under the same conditions. Solution of a system of linear equations provided a further test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster behavior with respect to 'parfor' (parallel for-loop) and 'spmd' (single program multiple data), two codes containing those two commands were applied to the same problem: determination of the SVD of a square matrix with n=1000. Execution of the codes on COCGT showed that: 1) for the code with 'parfor', performance improved as the number of labs increased from 1 to 8; 2) for the code with 'spmd', a single lab (core) was enough to process the problem and return results in less than 1 s. A similar test was then run, with the difference that the SVD was determined for a square matrix with n=1500 for the 'parfor' code and n=7000 for the 'spmd' code. These results lead to the conclusions that: 1) for the 'parfor' code, the behavior was the same as described above; 2) for the 'spmd' code, the behavior was also the same, besides producing higher performance; it supports a

  1. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  2. Characterization of a novel variant of amino acid transport system asc in erythrocytes from Przewalski's horse (Equus przewalskii).

    Science.gov (United States)

    Fincham, D A; Ellory, J C; Young, J D

    1992-08-01

    In thoroughbred horses, red blood cell amino acid transport activity is Na(+)-independent and controlled by three codominant genetic alleles (h, l, s), coding for high-affinity system asc1 (L-alanine apparent Km for influx at 37 degrees C congruent to 0.35 mM), low-affinity system asc2 (L-alanine Km congruent to 14 mM), and transport deficiency, respectively. The present study investigated amino acid transport mechanisms in red cells from four wild species: Przewalski's horse (Equus przewalskii), Hartmann's zebra (Zebra hartmannae), Grevy's zebra (Zebra grevyi), and onager (Equus hemonius). Red blood cell samples from different Przewalski's horses exhibited uniformly high rates of L-alanine uptake, mediated by a high-affinity asc1-type transport system. Mean apparent Km and Vmax values (+/- SE) for L-alanine influx at 37 degrees C in red cells from 10 individual animals were 0.373 +/- 0.068 mM and 2.27 +/- 0.11 mmol (L cells.h), respectively. As in thoroughbreds, the Przewalski's horse transporter interacted with dibasic as well as neutral amino acids. However, the Przewalski asc1 isoform transported L-lysine with a substantially (6.4-fold) higher apparent affinity than its thoroughbred counterpart (Km for influx 1.4 mM at 37 degrees C) and was also less prone to trans-stimulation effects. The novel high apparent affinity of the Przewalski's horse transporter for L-lysine provides additional key evidence of functional and possible structural similarities between asc and the classical Na(+)-dependent system ASC and between these systems and the Na(+)-independent dibasic amino acid transport system y+. Unlike Przewalski's horse, zebra red cells were polymorphic with respect to L-alanine transport activity, showing high-affinity or low-affinity saturable mechanisms of L-alanine uptake. Onager red cells transported this amino acid with intermediate affinity (apparent Km for influx 3.0 mM at 37 degrees C). Radiation inactivation analysis was used to estimate the target
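
    The influx kinetics reported above (apparent Km and Vmax) are typically obtained by fitting a Michaelis-Menten saturation curve to uptake measurements. The sketch below shows a generic fit with scipy.optimize.curve_fit on invented data points, purely to illustrate how such constants are estimated; it is not the analysis used in the study.

```python
# Sketch: estimating apparent Km and Vmax from influx data (Michaelis-Menten fit).
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    # v = Vmax * [S] / (Km + [S])
    return vmax * s / (km + s)

# Invented L-alanine concentrations (mM) and influx rates (mmol per (L cells * h))
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0])
influx = np.array([0.28, 0.49, 0.90, 1.31, 1.68, 1.93, 2.12])

(vmax_fit, km_fit), cov = curve_fit(michaelis_menten, conc, influx, p0=[2.0, 0.4])
print(f"Vmax ~ {vmax_fit:.2f}, Km ~ {km_fit:.3f} mM")
```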

  3. Factors Affecting Innovation Within Aeronautical Systems Center (ASC) Organizations - An inductive Study

    National Research Council Canada - National Science Library

    Feil, Eric

    2003-01-01

    .... This thesis analyzed data collected during the 2002 Chief of Staff of the Air Force Organizational Climate Survey to identify factors that affect innovation within Aeronautical Systems Center (ASC) organizations...

  4. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce the unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  5. Profiling Autism Symptomatology: An Exploration of the Q-ASC Parental Report Scale in Capturing Sex Differences in Autism

    Science.gov (United States)

    Ormond, Sarah; Brownlow, Charlotte; Garnett, Michelle Sarah; Rynkiewicz, Agnieszka; Attwood, Tony

    2018-01-01

    The Questionnaire for Autism Spectrum Conditions (Q-ASC) was developed by Attwood et al. (2011) to identify gender-sensitive profiles of autism symptomatology; prioritise and adjust the direction of clinical interventions; and support positive psychosocial outcomes and prognosis into adulthood. The current research piloted the Q-ASC with parents…

  6. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil,Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  7. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION: Draft... Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1207 is proposed Revision 1 of... for Digital Computer Software Used in Safety Systems of Nuclear Power Plants'' is temporarily...

  8. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    Science.gov (United States)

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  9. Aeroacoustics research in Europe : the CEAS-ASC report on 1997 highlights

    NARCIS (Netherlands)

    Rienstra, S.W.

    1998-01-01

    This paper is a report on the highlights of aeroacoustics research and development in Europe in 1997, compiled from information provided in the CEAS Aeroacoustics Specialists Committee (ASC). The Confederation of European Aerospace Societies (CEAS) comprises the national Aerospace Societies of

  10. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  11. Human adipose stem cell and ASC-derived cardiac progenitor cellular therapy improves outcomes in a murine model of myocardial infarction

    Directory of Open Access Journals (Sweden)

    Davy PMC

    2015-10-01

    Full Text Available Philip MC Davy,1 Kevin D Lye,2,3 Juanita Mathews,1 Jesse B Owens,1 Alice Y Chow,1 Livingston Wong,2 Stefan Moisyadi,1 Richard C Allsopp1 1Institute for Biogenesis Research, 2John A. Burns School of Medicine, University of Hawaii at Mānoa, 3Tissue Genesis, Inc., Honolulu, HI, USA Background: Adipose tissue is an abundant and potent source of adult stem cells for transplant therapy. In this study, we present our findings on the potential application of adipose-derived stem cells (ASCs) as well as induced cardiac-like progenitors (iCPs) derived from ASCs for the treatment of myocardial infarction. Methods and results: Human bone marrow (BM)-derived stem cells, ASCs, and iCPs generated from ASCs using three defined cardiac lineage transcription factors were assessed in an immune-compromised mouse myocardial infarction model. Analysis of iCPs prior to transplant confirmed changes in gene and protein expression consistent with a cardiac phenotype. Endpoint analysis was performed 1 month posttransplant. Significantly increased endpoint fractional shortening, as well as reduction in the infarct area at risk, was observed in recipients of iCPs as compared to the other recipient cohorts. Both recipients of iCPs and ASCs presented higher myocardial capillary densities than either recipients of BM-derived stem cells or the control cohort. Furthermore, mice receiving iCPs had a significantly higher cardiac retention of transplanted cells than all other groups. Conclusion: Overall, iCPs generated from ASCs outperform BM-derived stem cells and ASCs in facilitating recovery from induced myocardial infarction in mice. Keywords: adipose stem cells, myocardial infarction, cellular reprogramming, cellular therapy, piggyBac, induced cardiac-like progenitors

  12. Right Size Determining the Staff Necessary to Sustain Simulation and Computing Capabilities for Nuclear Security

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, Daniel J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-09-10

    The Advanced Simulation and Computing Campaign, herein referred to as the ASC Program, is a core element of the science-based Stockpile Stewardship Program (SSP), which enables assessment, certification, and maintenance of the safety, security, and reliability of the U.S. nuclear stockpile without the need to resume nuclear testing. The use of advanced parallel computing has transitioned from proof-of-principle to become a critical element for assessing and certifying the stockpile. As the initiative phase of the ASC Program came to an end in the mid-2000s, the National Nuclear Security Administration redirected resources to other urgent priorities, and resulting staff reductions in ASC occurred without the benefit of analysis of the impact on modern stockpile stewardship that is dependent on these new simulation capabilities. Consequently, in mid-2008 the ASC Program management commissioned a study to estimate the essential size and balance needed to sustain advanced simulation as a core component of stockpile stewardship. The ASC Program requires a minimum base staff size of 930 (which includes the number of staff necessary to maintain critical technical disciplines as well as to execute required programmatic tasks) to sustain its essential ongoing role in stockpile stewardship.

  13. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance for progressively optimizing a matrix-transposing application in CUDA. One particular interest was to research how well the optimization techniques, applied to software applications written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to the best of our knowledge) has tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance-improving techniques.
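    To make the memory-locality argument concrete, the following is a minimal CPU-side sketch of the tiled-transpose idea that shared-memory CUDA transpose kernels exploit; it is written in Python/NumPy purely for illustration, is not the authors' CUDA code, and the matrix and tile sizes are arbitrary.

        import numpy as np

        def blocked_transpose(a, tile=64):
            # Transpose one tile at a time so reads and writes stay within a
            # small, cache-friendly block, the same locality idea that
            # shared-memory CUDA transpose kernels rely on.
            n, m = a.shape
            out = np.empty((m, n), dtype=a.dtype)
            for i in range(0, n, tile):
                for j in range(0, m, tile):
                    out[j:j + tile, i:i + tile] = a[i:i + tile, j:j + tile].T
            return out

        a = np.arange(1024 * 1024, dtype=np.float32).reshape(1024, 1024)
        assert np.array_equal(blocked_transpose(a), a.T)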

  14. A state-of-the-art report on software operation structure of the digital control computer system

    International Nuclear Information System (INIS)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook

    1994-06-01

    CANDU Nuclear Power Plants, including Wolsong 1 and 2/3/4, are controlled by a real-time plant control computer system. This report was written to provide an overview of the station control computer software, which belongs to one of the most advanced real-time computing application areas, along with the Fuel Handling Machine design concepts. The combination of a well designed control computer and Fuel Handling Machine allows fuel bundles to be changed while the plant is in operation. Design methodologies and software structure are discussed, along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author)

  15. A state-of-the-art report on software operation structure of the digital control computer system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    CANDU Nuclear Power Plants, including Wolsong 1 and 2/3/4, are controlled by a real-time plant control computer system. This report was written to provide an overview of the station control computer software, which belongs to one of the most advanced real-time computing application areas, along with the Fuel Handling Machine design concepts. The combination of a well designed control computer and Fuel Handling Machine allows fuel bundles to be changed while the plant is in operation. Design methodologies and software structure are discussed, along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author).

  16. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Graf, F.A. Jr.

    1995-02-27

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, and some key aspects of the Liquid Effluent Retention Facility that stores condensate to be processed. Also controlled is the software for the Treated Effluent Disposal System's pumping stations, which monitors waste generator flows in this system as well as the Phase Two Effluent Collection System.

  17. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1995-01-01

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, and some key aspects of the Liquid Effluent Retention Facility that stores condensate to be processed. Also controlled is the software for the Treated Effluent Disposal System's pumping stations, which monitors waste generator flows in this system as well as the Phase Two Effluent Collection System

  18. FREE SOFTWARE IN ELECTRONIC LEARNING FUTURE TEACHERS OF MATHEMATICS, PHYSICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vladyslav Ye. Velychko

    2016-05-01

    Full Text Available The popularity of free software in the IT industry is much higher than its popularity in educational activities. The disadvantages of free software and the problems of implementing it in the educational process are limiting factors for its use in the education system; however, openness, accessibility and functionality are the main arguments for introducing free software into the educational process. Moreover, free software is particularly well suited to future teachers of mathematics, physics and informatics because of the way it is created, and there is therefore a need for a systematic analysis of the possibilities of using open source software in e-learning for future teachers of mathematics, physics and computer science.

  19. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  20. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    National Research Council Canada - National Science Library

    Skjellum, Anthony

    2004-01-01

    Polymorphous Computing Architectures (PCA) rapidly "morph" (reorganize) software and hardware configurations in order to achieve high performance on computation styles ranging from specialized streaming to general threaded applications...

  1. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    Science.gov (United States)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years that utilize scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes are barriers to effectively implementing the emerging citation norms. Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.
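    As an illustration of the citation elements the survey found to be missing (version numbers and persistent identifiers), the following minimal Python sketch assembles an attribution string from structured metadata; the field names and values are invented and do not reflect the actual CIG/SAGA schema or tools.

        # Hypothetical sketch: build a citable attribution string from structured
        # software metadata, including the version number and persistent
        # identifier that the survey found are usually missing.
        def format_attribution(meta):
            return ("{name} v{version} ({doi}), {authors}, {year}. "
                    "Retrieved from {url}").format(**meta)

        meta = {
            "name": "ExampleGeodynamicsCode",   # invented example values
            "version": "2.1.0",
            "doi": "doi:10.0000/example",
            "authors": "Developer A and Developer B",
            "year": 2015,
            "url": "https://example.org/code",
        }
        print(format_attribution(meta))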

  2. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation-independent behaviour and supports the widely observed phenomenon that defects cluster.

  3. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    ... or maximizing the system reliability subject to budget constraints. These kinds of optimization problems have been considered in both deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach has been considered a successful soft computing modelling approach. ... Firstly, a review of existing soft computing approaches to optimization is given. The main section extends these results, considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures, which proved ...
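    To illustrate the kind of allocation problem described above, the following Python sketch maximizes the reliability of a small series system subject to a budget by exhaustive search; the cost figures are invented, and the search merely stands in for, rather than reproduces, the intuitionistic-fuzzy and self-organizing migrating algorithm approach of the paper.

        from itertools import product

        levels = [0.90, 0.95, 0.99]               # candidate component reliabilities
        cost = {0.90: 1.0, 0.95: 2.5, 0.99: 6.0}  # assumed cost of each level
        budget = 10.0
        n_components = 3

        best = None
        for combo in product(levels, repeat=n_components):
            total_cost = sum(cost[r] for r in combo)
            if total_cost > budget:
                continue
            system_rel = 1.0
            for r in combo:
                system_rel *= r                   # series system: product of reliabilities
            if best is None or system_rel > best[0]:
                best = (system_rel, combo, total_cost)

        print(best)   # most reliable allocation that fits the budget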

  4. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    Science.gov (United States)

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  5. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation.

  6. Computer software configuration management plan for the Honeywell modular automation system

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software management plan for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This type of system will be used to control new thermal stabilization furnaces, a vertical denitrator calciner, and a pyrolysis furnace

  7. The manual of a computer software 'FBR Plant Planning Design Prototype System'

    International Nuclear Information System (INIS)

    2003-10-01

    This is the manual for the computer software 'FBR Plant Planning Design Prototype System', which enables users to conduct case studies of FBR design concepts deviating from the 'MONJU' design. The calculations proceed simply as the user clicks the displayed buttons; therefore, a step-by-step explanation is not considered necessary. The following pages introduce only particular features of this software, i.e., the individual interactive screens, the functions of the buttons and the consequences of clicking them, and the quitting procedure. (author)

  8. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD-sponsored multi-agency workshop on Computational Science and Engineering Software Sustainability and Productivity (CSESSP) challenges.

  9. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently in research is the Computer Aided Prototyping System (CAPS), managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  10. USERDA computer software summaries: numbers 240 through 324

    International Nuclear Information System (INIS)

    1976-12-01

    Since 1960 the Argonne Code Center has served as a U.S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U.S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Software Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data

  11. Repair of Achilles tendon defect with autologous ASCs engineered tendon in a rabbit model.

    Science.gov (United States)

    Deng, Dan; Wang, Wenbo; Wang, Bin; Zhang, Peihua; Zhou, Guangdong; Zhang, Wen Jie; Cao, Yilin; Liu, Wei

    2014-10-01

    Adipose derived stem cells (ASCs) are an important cell source for tissue regeneration and have demonstrated the potential for tenogenic differentiation in vitro. This study explored the feasibility of using ASCs for engineered tendon repair in vivo in a rabbit Achilles tendon model. A total of 30 rabbits were involved in this study. A composite tendon scaffold composed of an inner part of polyglycolic acid (PGA) unwoven fibers and an outer part of a net knitted with PGA/PLA (polylactic acid) fibers was used to provide mechanical strength. Autologous ASCs were harvested from nuchal subcutaneous adipose tissues and expanded in vitro. The expanded ASCs were harvested, resuspended in culture medium and evenly seeded onto the scaffold in the experimental group, whereas cell-free scaffolds served as the control group. The constructs of both groups were cultured inside a bioreactor under dynamic stretch for 5 weeks. In each of the 30 rabbits, a 2 cm defect was created on the right Achilles tendon, followed by the transplantation of a 3 cm cell-seeded scaffold in the experimental group of 15 rabbits, or of a 3 cm cell-free scaffold in the control group of 15 rabbits. Animals were sacrificed at 12, 21 and 45 weeks post-surgery for gross view, histology, and mechanical analysis. The results showed that short-term in vitro culture enabled ASCs to produce matrix on the PGA fibers, and the constructs showed tensile strength around 50 MPa in both groups (p > 0.05). With increasing implantation time, cell-seeded constructs gradually formed neo-tendon and became more mature at 45 weeks, with histological structure similar to that of native tendon and with the presence of the bipolar pattern and D-periodic structure of formed collagen fibrils. Additionally, both collagen fibril diameters and tensile strength increased continuously, with significant differences among the different time points (p < 0.05), whereas in the control group tendon tissue with fibril structure was observable only at 45 weeks

  12. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands of future high energy physics experiments towards software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings have been broadly outlined.

  13. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software programs for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  14. 18 CFR Appendix 1 to Part 301 - ASC Utility Filing Template

    Science.gov (United States)

    2010-04-01

    18 CFR, Conservation of Power and Water Resources, Appendix 1 to Part 301 - ASC Utility Filing Template; Federal Energy Regulatory Commission, Department of Energy; Regulations for Federal Power Marketing Administrations; Average System Cost ...

  15. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    Science.gov (United States)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-01-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In…

  16. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a computational modules, (b data-processing scripts, and (c research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  17. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With the growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for functional/procedural software specification, while the entity-relationship diagram method has proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
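    As a small illustration of the two functional techniques the survey singles out, the sketch below writes equivalence-class and boundary-value test cases in pytest for a hypothetical input-validation routine; the routine and its limits are invented and are not taken from the codes discussed in the paper.

        import pytest

        def validate_enrichment(x):
            # hypothetical routine: accepts an enrichment fraction in [0.0, 0.05]
            if not 0.0 <= x <= 0.05:
                raise ValueError("enrichment out of range")
            return x

        # valid equivalence class plus both boundary values
        @pytest.mark.parametrize("x", [0.0, 0.025, 0.05])
        def test_valid_values_accepted(x):
            assert validate_enrichment(x) == x

        # values just outside each boundary
        @pytest.mark.parametrize("x", [-0.001, 0.0501])
        def test_invalid_values_rejected(x):
            with pytest.raises(ValueError):
                validate_enrichment(x)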

  18. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Science.gov (United States)

    2012-08-22

    ... regulations with respect to software verification and auditing of digital computer software used in the safety... Standards and Records,'' which requires, in part, that a quality assurance program be established and implemented to provide adequate assurance that systems and components important to safety will satisfactorily...

  19. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  20. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2013-08-02

    .... ML12354A524. 3. Revision 1 of RG 1.170, ``Test Documentation for Digital Computer Software used in Safety... is in ADAMS at Accession No. ML12354A531. 4. Revision 1 of RG 1.171, ``Software Unit Testing for... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION...

  1. What's New in Software? Computers and the Writing Process: Strategies That Work.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)

  2. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications, such as m-learning systems. This study presents an innovative method that uses web technology and software engineering best practices to provide m-learning functionalities hosted in an MCC-learning system as a service. Components hosted by the MCC are used to empower developers to create…

  3. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is an open source and powerful numerical software package, but leaves much to be desired in the field of user friendliness. In this thesis the basic operation of OpenFOAM is introduced, and the thesis culminates in a graphical user interface written in PyQt. The graphical user interface makes the use of OpenFOAM simpler and will hopefully make this powerful tool more accessible to the general user.
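    A minimal sketch of the pattern such a front end uses is shown below: a PyQt button whose handler launches an OpenFOAM solver in a case directory through a subprocess. This is not the thesis code; the solver name and case path are placeholders.

        import subprocess
        import sys
        from PyQt5.QtWidgets import QApplication, QPushButton

        def run_solver():
            # launch the solver in the chosen case directory (placeholder path)
            subprocess.Popen(["simpleFoam"], cwd="/path/to/case")

        app = QApplication(sys.argv)
        button = QPushButton("Run simpleFoam")
        button.clicked.connect(run_solver)
        button.show()
        sys.exit(app.exec_())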

  4. The Effects of Computer-Aided Design Software on Engineering Students' Spatial Visualisation Skills

    Science.gov (United States)

    Kösa, Temel; Karakus, Fatih

    2018-01-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations…

  5. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code due to possible unfamiliarity with the language and often due as well to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities which include among others, error detection, error correction and code modification for purposes of enhancing its performance, functionality or to adapt it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has on computer codes and several examples will be provided. Finally a brief discussion of combining state-of-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)
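    The flavour of such program-understanding aids can be illustrated with a few lines of Python that extract a call graph from source code using the standard ast module; the scientific codes in question are typically written in Fortran or similar languages, so this is only an analogue of the idea, not a tool for those codes.

        import ast

        # a tiny example program, given here as a string
        source = (
            "def load(): pass\n"
            "def solve(): load()\n"
            "def main():\n"
            "    solve()\n"
            "    load()\n"
        )

        calls = {}
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef):
                called = [n.func.id for n in ast.walk(node)
                          if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]
                calls[node.name] = sorted(set(called))

        print(calls)   # {'load': [], 'solve': ['load'], 'main': ['load', 'solve']}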

  6. Epistemic Opacity, Confirmation Holism and Technical Debt: Computer Simulation in the Light of Empirical Software Engineering

    OpenAIRE

    Newman , Julian

    2015-01-01

    Epistemic opacity vis a vis human agents has been presented as an essential, ineliminable characteristic of computer simulation models resulting from the characteristics of the human cognitive agent. This paper argues, on the contrary, that such epistemic opacity as does occur in computer simulations is not a consequence of human limitations but of a failure on the part of model developers to adopt good software engineering practice for managing human error and ensuring the software artefact ...

  7. Androgen receptor (AR) degradation enhancer ASC-J9® in an FDA-approved formulated solution suppresses castration resistant prostate cancer cell growth.

    Science.gov (United States)

    Cheng, Max A; Chou, Fu-Ju; Wang, Keliang; Yang, Rachel; Ding, Jie; Zhang, Qiaoxia; Li, Gonghui; Yeh, Shuyuan; Xu, Defeng; Chang, Chawnshang

    2018-03-28

    ASC-J9® is a recently-developed androgen receptor (AR)-degradation enhancer that effectively suppresses castration resistant prostate cancer (PCa) cell proliferation and invasion. The optimal half maximum inhibitory concentrations (IC50) of ASC-J9® at various PCa cell confluences (20%, 50%, and 100%) were assessed via both short-term MTT growth assays and long-term clonogenic proliferation assays. Our results indicate that the IC50 values for ASC-J9® increased with increasing cell confluency. The IC50 values were significantly decreased in PCa AR-positive cells compared to PCa AR-negative cells or in normal prostate cells. This suggests that ASC-J9® may function mainly via targeting the AR-positive PCa cells with limited unwanted side-effects to suppress the surrounding normal prostate cells. Mechanism dissection indicated that ASC-J9® might function via altering the apoptosis signals to suppress the PCa AR-negative PC-3 cells. Preclinical studies using multiple in vitro PCa cell lines and an in vivo mouse model with xenografted castration-resistant PCa CWR22Rv1 cells demonstrated that ASC-J9® has similar AR degradation effects when dissolved in FDA-approved solvents, including DMSO, PEG-400:Tween-80 (95:5), DMA:Labrasol:Tween-80 (10:45:45), and DMA:Labrasol:Tween-20 (10:45:45). Together, results from preclinical studies suggest a potential new therapy with AR-degradation enhancer ASC-J9® may potentially be ready to be used in human clinical trials in order to better suppress PCa at later castration resistant stages. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD) the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities and graphical rendering of the structural outcomes. To showcase the potential of this approach, performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
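    The following Python sketch is one way to picture the kind of "glue" layer such an infrastructure provides: converting inputs, launching a docking job, and recording the run. The program name, flags and paths are invented; they do not describe the authors' system or any particular docking package's command line.

        import datetime
        import json
        import pathlib
        import shlex
        import subprocess

        def run_docking(ligand_file, receptor_file, workdir="runs"):
            # create a time-stamped working directory for this run
            out = pathlib.Path(workdir) / datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
            out.mkdir(parents=True, exist_ok=True)
            # 'example_dock' is a placeholder command, not a real docking program
            cmd = f"example_dock --ligand {ligand_file} --receptor {receptor_file} --out {out}"
            result = subprocess.run(shlex.split(cmd), capture_output=True, text=True)
            # keep a record of what was run and whether it succeeded
            (out / "run.json").write_text(json.dumps(
                {"command": cmd, "returncode": result.returncode}, indent=2))
            return out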

  9. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.
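    As a toy illustration of the software role described above (translating recorded brain signals into device output commands), the sketch below thresholds the band power of one signal window; the signal is synthetic and the threshold arbitrary, and real BCI software additionally handles calibration, artifacts and feedback.

        import numpy as np

        def band_power(window, fs, lo, hi):
            # power of the signal window within the [lo, hi] Hz band
            freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
            spectrum = np.abs(np.fft.rfft(window)) ** 2
            return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

        fs = 256                                    # sampling rate in Hz
        t = np.arange(fs) / fs
        window = np.sin(2 * np.pi * 10 * t)         # synthetic 10 Hz rhythm
        power = band_power(window, fs, 8, 12)
        command = "MOVE_CURSOR_LEFT" if power > 50.0 else "IDLE"
        print(power, command)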

  10. Current trends in hardware and software for brain-computer interfaces (BCIs)

    Science.gov (United States)

    Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  11. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  12. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  13. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibilities of organizing experimental investigation of mathematical objects on the basis of these tools, of formulating new tasks on the basis of a limited number of tools, and of fast automated checking are specified. Some methodological comments on the application of computer tools and ...

  14. Low-frequency, low-magnitude vibrations (LFLM) enhances chondrogenic differentiation potential of human adipose derived mesenchymal stromal stem cells (hASCs)

    Directory of Open Access Journals (Sweden)

    Krzysztof Marycz

    2016-02-01

    Full Text Available The aim of this study was to evaluate whether low-frequency, low-magnitude vibrations (LFLM) could enhance the chondrogenic differentiation potential of human adipose derived mesenchymal stem cells (hASCs) with simultaneous inhibition of their adipogenic properties for biomedical purposes. We developed a prototype device that induces low-magnitude (0.3 g), low-frequency vibrations at the following frequencies: 25, 35 and 45 Hz. Afterwards, we used human adipose derived mesenchymal stem cells (hASCs) to investigate their cellular response to the mechanical signals. We also evaluated changes in hASC morphology and proliferative activity in response to each frequency. Induction of chondrogenesis in hASCs under the influence of a 35 Hz signal led to the most effective and stable cartilaginous tissue formation through the highest secretion of Bone Morphogenetic Protein 2 (BMP-2) and Collagen type II, with a low concentration of Collagen type I. These results correlated well with the corresponding gene expression levels. Simultaneously, we observed significant up-regulation of α3, α4, β1 and β3 integrins, as well as Sox-9, in chondroblast progenitor cells treated with 35 Hz vibrations. Interestingly, we noticed that application of the 35 Hz frequency significantly inhibited adipogenesis of hASCs. The obtained results suggest that application of LFLM vibrations together with stem cell therapy might be a promising tool in cartilage regeneration.

  15. Computational mathematics and mathematical computer software. Vychislitel'naia matematika i matematicheskoe obespechenie EVM

    Energy Technology Data Exchange (ETDEWEB)

    Tikhonov, A.N.; Samarskii, A.A.

    1985-01-01

    Various aspects of mathematical modeling and problem-oriented computer software are examined with reference to numerical methods in mathematical physics, methods for solving inverse problems, development of automatic systems for experimental data processing, and mathematical modeling in plasma physics. Papers are presented on some properties of difference schemes in one-dimensional gas dynamics, an algorithm for processing signals reflected from multipoint targets, and the application of simplified Navier-Stokes equations for calculating flow of a viscous gas past long bodies.

  16. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
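    For orientation, a ProjectQ "hello world" looks roughly like the sketch below, modeled on the project's published quick-start example (the exact API may differ between releases): a qubit is allocated, put into superposition, measured, and the circuit is flushed to the default simulator back-end.

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()               # default compiler engines + simulator back-end
        qubit = eng.allocate_qubit()     # allocate one qubit
        H | qubit                        # put it into an equal superposition
        Measure | qubit                  # measure in the computational basis
        eng.flush()                      # send the circuit to the back-end
        print("Measured:", int(qubit))   # 0 or 1, each with probability 1/2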

  17. Interference of ascorbic acid in blood glucose measurement - doi: 10.5102/ucs.v6i2.722

    OpenAIRE

    Aline Cardoso Barbosa; Tania Cristina Andrade

    2009-01-01

    It is known that laboratory tests are subject to interfering factors. The scientific literature cites interference by ascorbic acid in assays that involve oxidation-reduction reactions, such as blood glucose measurement. For this reason, this study evaluated the effect of ascorbic acid on glucose measurement in two types of control serum subjected to different concentrations of ascorbic acid, ranging from 2.5 to 100 mg/dL. One control serum with normal glucose values and another with pathological values were used. It was observed...

  18. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
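    The DICOM "scrubbing" step mentioned above can be sketched in a few lines of Python with the pydicom library (an assumption for illustration; the article itself discusses general-purpose utilities rather than this package). Only a few identifying elements are cleared here; a real anonymizer must cover the full set of identifying tags.

        import pydicom

        def scrub(path_in, path_out):
            # clear a handful of patient-identifying header elements
            ds = pydicom.dcmread(path_in)
            for keyword in ("PatientName", "PatientID", "PatientBirthDate"):
                if keyword in ds:
                    setattr(ds, keyword, "")
            ds.save_as(path_out)

        # scrub("study.dcm", "study_anon.dcm")   # example usage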

  19. Prevalence of cervical intraepithelial neoplasia grades II/III and cervical cancer in patients with cytological diagnosis of atypical squamous cells when high-grade intraepithelial lesions (ASC-H) cannot be ruled out.

    Science.gov (United States)

    Cytryn, Andréa; Russomano, Fábio Bastos; Camargo, Maria José de; Zardo, Lucília Maria Gama; Horta, Nilza Maria Sobral Rebelo; Fonseca, Rachel de Carvalho Silveira de Paula; Tristão, Maria Aparecida; Monteiro, Aparecida Cristina Sampaio

    2009-09-01

    The latest update of the Bethesda System divided the category of atypical squamous cells of undetermined significance (ASCUS) into ASC-US (undetermined significance) and ASC-H (high-grade intraepithelial lesion cannot be ruled out). The aims here were to measure the prevalence of pre-invasive lesions (cervical intraepithelial neoplasia, CIN II/III) and cervical cancer among patients referred to Instituto Fernandes Figueira (IFF) with ASC-H cytology, and compare them with ASC-US cases. Cross-sectional study with retrospective data collection, at the IFF Cervical Pathology outpatient clinic. ASCUS cases referred to IFF from November 1997 to September 2007 were reviewed according to the 2001 Bethesda System to reach cytological consensus. The resulting ASC-H and ASC-US cases, along with new cases, were analyzed relative to the outcome of interest. The histological diagnosis (or cytocolposcopic follow-up in cases without such diagnosis) was taken as the gold standard. The prevalence of CIN II/III in cases with ASC-H cytology was 19.29% (95% confidence interval, CI, 9.05-29.55%) and the risk of these lesions was greater among patients with ASC-H than with ASC-US cytology (prevalence ratio, PR, 10.42; 95% CI, 2.39-45.47; P = 0.0000764). Pre-invasive lesions were more frequently found in patients under 50 years of age with ASC-H cytology (PR, 2.67; 95% CI, 0.38-18.83; P = 0.2786998). There were no uterine cervical cancer cases. The prevalence of CIN II/III in patients with ASC-H cytology was significantly higher than with ASC-US, and division into ASC diagnostic subcategories had good capacity for discriminating the presence of pre-invasive lesions.

  20. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
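    A minimal numerical sketch of the architecture-based idea described above is given below: components form a discrete-time Markov chain, each visited component works with its own reliability, and the tool's reliability is the probability of reaching the terminal component without a failure. The transition probabilities and reliabilities are invented numbers, not values from the case study.

        import numpy as np

        # inter-component transfer probabilities (invented); component 2 is terminal
        P = np.array([[0.0, 0.7, 0.3],
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])
        R = np.array([0.99, 0.97, 0.995])    # per-component reliabilities (invented)

        # scale each outgoing transition by the probability that the source
        # component executes correctly, then solve the absorbing chain
        Q = P * R[:, None]
        N = np.linalg.inv(np.eye(3) - Q)     # expected visits starting from component 0
        system_reliability = N[0, 2] * R[2]  # reach the terminal component and it works
        print(round(system_reliability, 4))  # about 0.964 for these numbers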

  1. The Astrobiology in Secondary Classrooms (ASC) curriculum: focusing upon diverse students and teachers.

    Science.gov (United States)

    Arino de la Rubia, Leigh S

    2012-09-01

    The Minority Institution Astrobiology Collaborative (MIAC) began working with the NASA Goddard Center for Astrobiology in 2003 to develop curriculum materials for high school chemistry and Earth science classes based on astrobiology concepts. The Astrobiology in Secondary Classrooms (ASC) modules emphasize interdisciplinary connections in astronomy, biology, chemistry, geoscience, physics, mathematics, and ethics through hands-on activities that address national educational standards. Field-testing of the Astrobiology in Secondary Classrooms materials occurred over three years in eight U.S. locations, each with populations that are underrepresented in the career fields of science, technology, engineering, and mathematics. Analysis of the educational research upon the high school students participating in the ASC project showed statistically significant increases in students' perceived knowledge and science reasoning. The curriculum is in its final stages, preparing for review to become a NASA educational product.

  2. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    Full Text Available In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvements in manufacturing technology, however, the development of an automatic system for fabricating curved hull plates remains at an early stage, since the hardware and software for automating the curved hull fabrication process must be developed differently depending on the dimensions of the plates and on the forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a “plug-in” framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for the automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.
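    One way to picture the "plug-in" idea is sketched below: components implement a small common interface and register themselves, so a shipyard-specific module can be swapped in without changing the framework. The class and method names are invented for illustration and do not reflect the authors' actual component design.

        from abc import ABC, abstractmethod

        class FormingComponent(ABC):
            @abstractmethod
            def compute(self, plate_geometry: dict) -> dict:
                """Return forming information for one curved hull plate."""

        class LineHeatingPlanner(FormingComponent):
            def compute(self, plate_geometry):
                # placeholder: a real planner would derive heating lines from curvature
                return {"plate_id": plate_geometry.get("id"), "heating_lines": []}

        REGISTRY = {"line_heating": LineHeatingPlanner()}   # plug-in registry

        def compute_forming_info(method, plate_geometry):
            return REGISTRY[method].compute(plate_geometry)

        print(compute_forming_info("line_heating", {"id": "P-001"}))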

  3. Prevalence of cervical intraepithelial neoplasia grades II/III and cervical cancer in patients with cytological diagnosis of atypical squamous cells when high-grade intraepithelial lesions (ASC-H) cannot be ruled out

    Directory of Open Access Journals (Sweden)

    Andréa Cytryn

    Full Text Available CONTEXT AND OBJECTIVE: The latest update of the Bethesda System divided the category of atypical squamous cells of undetermined significance (ASCUS) into ASC-US (undetermined significance) and ASC-H (high-grade intraepithelial lesion cannot be ruled out). The aims here were to measure the prevalence of pre-invasive lesions (cervical intraepithelial neoplasia, CIN II/III) and cervical cancer among patients referred to Instituto Fernandes Figueira (IFF) with ASC-H cytology, and compare them with ASC-US cases. DESIGN AND SETTING: Cross-sectional study with retrospective data collection, at the IFF Cervical Pathology outpatient clinic. METHODS: ASCUS cases referred to IFF from November 1997 to September 2007 were reviewed according to the 2001 Bethesda System to reach cytological consensus. The resulting ASC-H and ASC-US cases, along with new cases, were analyzed relative to the outcome of interest. The histological diagnosis (or cytocolposcopic follow-up in cases without such diagnosis) was taken as the gold standard. RESULTS: The prevalence of CIN II/III in cases with ASC-H cytology was 19.29% (95% confidence interval, CI, 9.05-29.55%) and the risk of these lesions was greater among patients with ASC-H than with ASC-US cytology (prevalence ratio, PR, 10.42; 95% CI, 2.39-45.47; P = 0.0000764). Pre-invasive lesions were more frequently found in patients under 50 years of age with ASC-H cytology (PR, 2.67; 95% CI, 0.38-18.83; P = 0.2786998). There were no uterine cervical cancer cases. CONCLUSION: The prevalence of CIN II/III in patients with ASC-H cytology was significantly higher than with ASC-US, and division into ASC diagnostic subcategories had good capacity for discriminating the presence of pre-invasive lesions.

  4. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    Science.gov (United States)

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  5. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  6. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, calibration and alignment iterative executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and the analysis jobs. An overview of the status and results of the CSA06 is presented in this work

  7. The future of commodity computing and many-core versus the interests of HEP software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  8. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, the algorithm, and the model of the software running on computer hardware included in the Grid network, which will allow a cloud computing environment to be implemented using Grid technologies.

  9. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    Science.gov (United States)

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, including only the self-reported factors and software-recorded computer usage patterns, which are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that only for some
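    A minimal sketch of the kind of prediction model and evaluation metrics described above (R2 and RMS error), assuming scikit-learn is available. The predictors and exposure values are randomly generated placeholders, not the study's dataset, and the variable names are hypothetical.

```python
# Hedged sketch: fit a linear prediction model and report R2 and RMS error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(117, 6))                                   # e.g. self-reports + usage patterns
y = X @ rng.normal(size=6) + rng.normal(scale=0.5, size=117)    # e.g. a wrist-kinematics exposure

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2  =", round(r2_score(y_te, pred), 2))
print("RMS =", round(mean_squared_error(y_te, pred) ** 0.5, 2))
```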

  10. Goethe Gossips with Grass: Using Computer Chatting Software in an Introductory Literature Course.

    Science.gov (United States)

    Fraser, Catherine C.

    1999-01-01

    Students in a third-year introduction to German literature course chatted over networked computers, using "FirstClass" software. A brief description of the course design is provided with detailed information on how the three chat sessions were organized. (Author/VWL)

  11. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types of information that are returned from the tools to the human user, and the forms in which these outputs are presented. This document and the Automated Software Tool Monitoring Program (Appendix 1) are...

  12. X-ray image processing software for computing object size and object location coordinates from acquired optical and x-ray images

    International Nuclear Information System (INIS)

    Tiwari, Akash; Tiwari, Shyam Sunder; Tiwari, Railesha; Panday, Lokesh; Panday, Jeet; Suri, Nitin

    2004-01-01

    X-ray and visible-image data processing software has been developed in Visual Basic for real-time online and offline image information processing for NDT and medical applications. The software computes two-dimensional image size parameters from sharp boundary lines by raster scanning the image contrast data. The code accepts bitmap image data and hunts for multiple tumors of different sizes that may be present in the image, then computes the size of each tumor and locates its approximate center, registering its location coordinates. The presence of foreign bodies, such as metal and glass balls, in industrial products such as chocolate and other food items imaged using the x-ray technique is detected by the software, and their size and position coordinates are computed. The paper discusses ways and means to compute the size and coordinates of air-bubble-like objects present in x-ray and optical images, and their multiple occurrences in the image of interest. (author)
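    The sketch below illustrates the kind of operation the record describes: segmenting blob-like objects (tumours, foreign bodies) from a thresholded image and reporting each object's area and centre coordinates. It uses scipy.ndimage rather than Visual Basic, and the synthetic image and threshold are invented for the example.

```python
# Hedged sketch: find blob-like objects, report their size and centre coordinates.
import numpy as np
from scipy import ndimage

def find_objects(image, threshold):
    mask = image > threshold                              # sharp-boundary segmentation
    labels, n = ndimage.label(mask)                       # one label per connected object
    sizes = ndimage.sum(mask.astype(float), labels, index=range(1, n + 1))
    centres = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return list(zip(sizes, centres))

demo = np.zeros((64, 64))
demo[10:18, 20:26] = 1.0                                  # synthetic "tumour"
demo[40:44, 40:44] = 1.0                                  # synthetic "glass ball"
for area, (cy, cx) in find_objects(demo, 0.5):
    print(f"area = {area:.0f} px, centre = ({cx:.1f}, {cy:.1f})")
```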

  13. The Impact Of Using Computer Software On Vocabulary Learning Of Iranian EFL University Students

    Directory of Open Access Journals (Sweden)

    Samira Pahlavanpoorfard

    2014-07-01

    Full Text Available Today, using computers is common in all fields, and education is no exception. This technology has been used increasingly in language classrooms, and more and more language teachers are using computers in their classes. This research study investigates the impact of using computers on the vocabulary learning of Iranian EFL university students. To this end, a sample of 40 university students at Islamic Azad University, Larestan branch, were randomly assigned to the experimental and control groups. Prior to the treatment, and to capture initial differences between the participants, all the students sat a pre-test, an Oxford Placement Test. The students then received the treatment for 10 weeks. The students in the experimental group were taught with computer software for vocabulary learning, while the students in the control group were taught through a traditional method. After the treatment, all the students sat a post-test. Statistical analysis using independent-samples t-tests revealed that the students in the experimental group, who used the computer software for vocabulary learning, performed better than the students in the control group, who were taught through the traditional method.
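    A minimal sketch of the analysis named in the record (an independent-samples t-test comparing post-test scores of the two groups), assuming SciPy. The scores below are invented for illustration only.

```python
# Hedged sketch: independent-samples t-test on hypothetical post-test scores.
from scipy import stats

experimental = [78, 81, 74, 88, 69, 90, 77, 83, 85, 80]   # software group (invented)
control      = [70, 65, 72, 68, 74, 66, 71, 69, 73, 67]   # traditional group (invented)

t, p = stats.ttest_ind(experimental, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```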

  14. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  15. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of softw...

  16. A software package to process an INIS magnetic tape on the VAX computer

    International Nuclear Information System (INIS)

    Omar, A.A.; Mohamed, F.A.

    1991-01-01

    This paper presents a software package whose function is to process, on VAX computers, the magnetic tapes distributed by the International Atomic Energy Agency. These tapes contain abstracts of papers in the different branches of the nuclear field and are supplied by the International Nuclear Information System (INIS). This paper has two goals. First, it gives a procedure for processing any foreign magnetic tape on VAX computers. Second, it solves the problem of reading the INIS tapes on a non-IBM computer, thus allowing specialists to benefit from the large amount of information contained in these tapes. 11 figs

  17. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study, in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator, is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments

  18. Differential regulation of caspase-1 activation, pyroptosis, and autophagy via Ipaf and ASC in Shigella-infected macrophages.

    Directory of Open Access Journals (Sweden)

    Toshihiko Suzuki

    2007-08-01

    Full Text Available Shigella infection, the cause of bacillary dysentery, induces caspase-1 activation and cell death in macrophages, but the precise mechanisms of this activation remain poorly understood. We demonstrate here that caspase-1 activation and IL-1beta processing induced by Shigella are mediated through Ipaf, a cytosolic pattern-recognition receptor of the nucleotide-binding oligomerization domain (NOD)-like receptor (NLR) family, and the adaptor protein apoptosis-associated speck-like protein containing a C-terminal caspase recruitment domain (ASC). We also show that Ipaf was critical for pyroptosis, a specialized form of caspase-1-dependent cell death induced in macrophages by bacterial infection, whereas ASC was dispensable. Unlike that observed in Salmonella and Legionella, caspase-1 activation induced by Shigella infection was independent of flagellin. Notably, infection of macrophages with Shigella induced autophagy, which was dramatically increased by the absence of caspase-1 or Ipaf, but not ASC. Autophagy induced by Shigella required an intact bacterial type III secretion system but not VirG protein, a bacterial factor required for autophagy in epithelial-infected cells. Treatment of macrophages with 3-methyladenine, an inhibitor of autophagy, enhanced pyroptosis induced by Shigella infection, suggesting that autophagy protects infected macrophages from pyroptosis. Thus, Ipaf plays a critical role in caspase-1 activation induced by Shigella independently of flagellin. Furthermore, the absence of Ipaf or caspase-1, but not ASC, regulates pyroptosis and the induction of autophagy in Shigella-infected macrophages, providing a novel function for NLR proteins in bacterial-host interactions.

  19. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • Code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation only takes 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  20. The shift to Cloud Computing : The impact of disruptive technology on the enterprise software business ecosystem

    NARCIS (Netherlands)

    Nieuwenhuis, Lambert J.M.; Ehrenhard, Michel L.; Prause, Lars

    2017-01-01

    The rapid diffusion of Cloud Computing influences the way enterprise software is developed, distributed, and implemented. This uptake of Cloud Computing has profound implications for the IT industry and related industries, as it not only affects the vendors' business models but also the other

  1. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  2. Stability of ascorbic acid in fresh fruit juices under different forms of storage

    OpenAIRE

    Cunha, Kelly Damasceno; Silva, Priscila Ribeiro da; Costa, Ana Lígia Faria e Silva da Fonseca; Teodoro, Anderson Junger; Koblitz, Maria Gabriela Bello

    2014-01-01

    Ascorbic acid is a water-soluble vitamin of long-established nutritional importance, owing to its action as a cofactor in several physiological processes and as an antioxidant. Humans depend on the daily intake of this micronutrient, whose main sources are fruits and vegetables. Being a less stable nutrient, ascorbic acid suffers losses during processing and storage, influenced by several factors such as pH, temperature, the presence of ions, etc. The literature ...

  3. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. It is thus unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations, and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of chemical structures and the calculation of atomic properties. The software is composed of a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. The program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. Complexity studies of the main algorithms demonstrate that they were implemented efficiently with respect to their trivial implementations. Lastly, performance tests reveal that the software behaves suitably as the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps and can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
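    As a rough illustration of the two-linear (bilinear) algebraic forms on which these descriptors are based, the sketch below evaluates d = x^T M y, where x and y are vectors of atomic properties and M is a relation matrix over atom pairs. The property values and matrix are invented placeholders and do not reproduce any QuBiLS-MIDAS descriptor definition.

```python
# Hedged sketch: a bilinear (two-linear) algebraic form over atomic properties.
import numpy as np

x = np.array([2.55, 3.44, 2.20, 2.55])        # e.g. electronegativities of 4 atoms (invented)
y = np.array([12.0, 16.0, 1.0, 12.0])         # e.g. atomic masses of the same atoms (invented)
M = np.array([[1.0, 0.5, 0.2, 0.3],           # e.g. a topological relation matrix (invented)
              [0.5, 1.0, 0.1, 0.4],
              [0.2, 0.1, 1.0, 0.2],
              [0.3, 0.4, 0.2, 1.0]])

descriptor = x @ M @ y                        # bilinear map over atom pairs
print(round(float(descriptor), 2))
```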

  4. Type of response according to human papillomavirus genotyping with cobas 4800 in HPV-positive ASC-US lesions

    OpenAIRE

    Kanjou Augé, Nadwa

    2016-01-01

    Objectives: To evaluate the risk of underlying pathology in the HPV-positive ASC-US patients of our catchment area analyzed by the COBAS 4800 method. The risk of CIN2+ at the time of diagnosis was studied in HPV-positive ASC-US women according to the papillomavirus genotype: HPV16, HPV18 (both including coinfections) or other high-risk HPV types (HR-HPV). The aim is to detect those women with HPV-positive ASC-US cytology who have a higher risk of progression and ...

  5. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, and of formulating new tasks on the basis of a limited number of tools with fast automated checking, is pointed out. Some methodological comments on the application of computer tools and on the methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's rethinking of forms and methods of training, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  6. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  7. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service (SaaS), Software Asset... PaaS (Platform as a Service); SaaS (Software as a Service); SAM (Software Asset Management); SMS (System Management Server); SEWP (Solutions for Enterprise Wide...) ...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  8. Seeing red? : The agency of computer software in the production and management of students’ school absences

    OpenAIRE

    Bodén, Linnea

    2013-01-01

    An increasing number of Swedish municipalities use digital software to manage the registration of students’ school absences. The software is regarded as a problem-solving tool to make registration more efficient, but its effects on the educational setting have been largely neglected. Focusing on an event with two students from a class of 11-year-olds, the aim of the paper is to explore schools’ common uses of computer software for registering absence in order to understand how materialities –...

  9. Completion Report for Multi-Site Incentive MRT 2779 Implement ASC Tripod Initiative by 30SEP08

    Energy Technology Data Exchange (ETDEWEB)

    East, D; Cerutti, J; Noe, J; Cupps, K; Loncaric, J; Sturtevant, J

    2008-09-22

    This report provides documentation and evidence for the completion of the deployment of the Tripod common operating system (TripodOS, also known as and generally referred to below as TOSS). Background documents for TOSS are provided in Appendices A and B, including the initial TOSS proposal accepted by ASC HQ and Executives in July 2007 and a Governance Model defined by a Tri-Lab working group in September 2007. Appendix C contains a document that clarifies the intent and requirements for the completion criteria associated with MRT 2779. The deployment of TOSS is a Multi-Site Incentive from the ASC FY08-09 Implementation Plan due at the end of Quarter 4 in FY08.

  10. Exploring a business to business recurring revenue framework for the delivery of software as a service through a cloud computing channel

    OpenAIRE

    Dempsey, David

    2015-01-01

    Cloud Computing (CC) is creating a new paradigm for the distribution of computer software applications. Within this context CC enabled Software as a Service (SaaS) fundamentally changes the revenue expectations and business model for the application software industry. This study considers the revenue expectation of the CC industry and its dependency on renewal subscriptions, while the study focuses on SaaS in the Business-to-Business (B2B) domain, delivered through the CC chann...

  11. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  12. Usefulness of Cone-Beam Computed Tomography and Automatic Vessel Detection Software in Emergency Transarterial Embolization

    Energy Technology Data Exchange (ETDEWEB)

    Carrafiello, Gianpaolo, E-mail: gcarraf@gmail.com; Ierardi, Anna Maria, E-mail: amierardi@yahoo.it; Duka, Ejona, E-mail: ejonaduka@hotmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Radaelli, Alessandro, E-mail: alessandro.radaelli@philips.com [Philips Healthcare (Netherlands); Floridi, Chiara, E-mail: chiara.floridi@gmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Bacuzzi, Alessandro, E-mail: alessandro.bacuzzi@ospedale.varese.it [University of Insubria, Anaesthesia and Palliative Care (Italy); Bucourt, Maximilian de, E-mail: maximilian.de-bucourt@charite.de [Charité - University Medicine Berlin, Department of Radiology (Germany); Marchi, Giuseppe De, E-mail: giuseppedemarchi@email.it [Insubria University, Department of Radiology, Interventional Radiology (Italy)

    2016-04-15

    Background: This study was designed to evaluate the utility of dual phase cone beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Methods: Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. Accuracy of DP-CBCT and AVD software were defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as the successful embolization. Total volume of iodinated contrast medium and overall procedure time were registered. Results: The bleeding site was not detected by the initial angiogram in 20% of cases, while impossibility to identify the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95%) patients; in one case CBCT-CT fusion was required. AVD software identified the culprit arterial branch in 18 of 20 (90%) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95%. Successful embolization was achieved in all patients. Mean contrast volume injected for each patient was 77.5 ml, and mean overall procedural time was 50 min. Conclusions: C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.

  13. The Computer-based Health Evaluation Software (CHES: a software for electronic patient-reported outcome monitoring

    Directory of Open Access Journals (Sweden)

    Holzner Bernhard

    2012-11-01

    Full Text Available Abstract Background: Patient-reported Outcomes (PROs), capturing e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO), with software packages to administer questionnaires, store data, and present results, has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home, with a special focus on the presentation of individual patients' results. Methods: Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for convenient creating and editing of questionnaires. Results: By 2012 CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion: During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily

  14. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  15. Software design for resilient computer systems

    CERN Document Server

    Schagaev, Igor

    2016-01-01

    This book addresses the question of how system software should be designed to account for faults, and which fault tolerance features it should provide for highest reliability. The authors first show how the system software interacts with the hardware to tolerate faults. They analyze and further develop the theory of fault tolerance to understand the different ways to increase the reliability of a system, with special attention on the role of system software in this process. They further develop the general algorithm of fault tolerance (GAFT) with its three main processes: hardware checking, preparation for recovery, and the recovery procedure. For each of the three processes, they analyze the requirements and properties theoretically and give possible implementation scenarios and system software support required. Based on the theoretical results, the authors derive an Oberon-based programming language with direct support of the three processes of GAFT. In the last part of this book, they introduce a simulator...

  16. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  17. Requirements Report Computer Software System for a Semi-Automatic Pipe Handling System and Fabrication Facility

    National Research Council Canada - National Science Library

    1980-01-01

    .... This report is to present the requirements of the computer software that must be developed to create Pipe Detail Drawings and to support the processing of the Pipe Detail Drawings through the Pipe Shop...

  18. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  19. R-1 (C-620-A) and R-2 (C-620-B) air compressor control logic, computer software description. Revision 1

    International Nuclear Information System (INIS)

    Walter, K.E.

    1995-01-01

    This document provides an updated computer software description for the software used on the FFTF R-1 (C-620-A) and R-2 (C-620-B) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started

  20. Development of computer tablet software for clinical quantification of lateral knee compartment translation during the pivot shift test.

    Science.gov (United States)

    Muller, Bart; Hofbauer, Marcus; Rahnemai-Azar, Amir Ata; Wolf, Megan; Araki, Daisuke; Hoshino, Yuichi; Araujo, Paulo; Debski, Richard E; Irrgang, James J; Fu, Freddie H; Musahl, Volker

    2016-01-01

    The pivot shift test is a commonly used clinical examination by orthopedic surgeons to evaluate knee function following injury. However, the test can only be graded subjectively by the examiner. Therefore, the purpose of this study was to develop software for a computer tablet to quantify anterior translation of the lateral knee compartment during the pivot shift test. Based on a simple image analysis method, software for a computer tablet was developed with the following primary design constraint: the software should be easy to use in a clinical setting and should not slow down an outpatient visit. Translation of the lateral compartment of the intact knee was 2.0 ± 0.2 mm and that of the anterior cruciate ligament-deficient knee was 8.9 ± 0.9 mm. The software provides reliable, objective, and quantitative data on translation of the lateral knee compartment during the pivot shift test and meets the design constraints posed by the clinical setting.
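    A hedged sketch of the simple image-analysis idea behind such software: track a landmark over the lateral compartment across video frames, convert pixel coordinates to millimetres with a known scale, and report the peak anterior excursion. The marker coordinates and calibration factor below are invented, and this is not the authors' implementation.

```python
# Hedged sketch: lateral-compartment translation from tracked marker positions.
import numpy as np

marker_px = np.array([[310, 205], [312, 205], [322, 207], [331, 210], [327, 209]])
mm_per_px = 0.42                       # hypothetical calibration (e.g. ruler in view)

x_mm = marker_px[:, 0] * mm_per_px     # anterior-posterior direction in the image
translation_mm = x_mm.max() - x_mm.min()
print(f"lateral compartment translation ~ {translation_mm:.1f} mm")
```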

  1. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    Science.gov (United States)

    2011-03-01

    capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to...

  2. Containment and surveillance for software

    International Nuclear Information System (INIS)

    Andress, J.C.; Adams, G.N.; Cotton, J.H.

    1993-07-01

    Some operators and state authorities are offering their computer systems, both hardware and software, to be used for safeguards purposes by the International Atomic Energy Agency. Therefore a need exists to develop a method of authenticating the data produced by a computer program before it can be used by the Agency. As part of a complete Computer Systems Authentication (COMSAT) package, a method of software containment and surveillance has been developed to complement existing software authentication techniques. The package is applicable to both operator- and Agency-provided systems. A program to demonstrate the principles has been written. With this facility, the Agency will be able to leave unattended software in the field, either to be used by the operator to generate data for inspection on their own computer, or to save an inspector from having to re-install inspection-specific software on an Agency computer, in the knowledge that the operation of the protected computer is being continuously monitored. If adopted, either of these uses will enable the Agency to reduce its costs. (Author)
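    One common building block of software authentication (though not necessarily the COMSAT method itself) is comparing a cryptographic hash of the deployed program against a reference value held by the inspectorate; the sketch below shows that idea. The file path and reference digest are hypothetical.

```python
# Hedged sketch: verify a deployed binary against a known-good SHA-256 digest.
import hashlib

def file_digest(path, algo="sha256"):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

REFERENCE = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"  # hypothetical value
if file_digest("/opt/inspection/collector.bin") != REFERENCE:                  # hypothetical path
    raise RuntimeError("software image does not match the authenticated version")
```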

  3. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software-based papers

  4. Vertical bone measurements from cone beam computed tomography images using different software packages

    International Nuclear Information System (INIS)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  5. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)
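    A hedged sketch of the statistical comparison described in these two records: each software package's measurements are compared against the gold standard with a Dunnett-type test (scipy.stats.dunnett, available in SciPy 1.11 or later). All measurement values below are invented placeholders, not the study data.

```python
# Hedged sketch: compare each software's measurements with the gold standard (Dunnett test).
import numpy as np
from scipy import stats

gold     = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.2, 9.5, 10.7])    # mm (invented)
xorancat = gold + np.random.default_rng(1).normal(0.25, 0.3, gold.size)
ondemand = gold + np.random.default_rng(2).normal(-0.11, 0.3, gold.size)
kdis3d   = gold + np.random.default_rng(3).normal(-0.14, 0.3, gold.size)

res = stats.dunnett(xorancat, ondemand, kdis3d, control=gold)
for name, p in zip(["XoranCat", "OnDemand3D", "KDIS3D"], res.pvalue):
    print(f"{name}: p = {p:.3f}")
```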

  6. Aspect-Oriented Software Development

    NARCIS (Netherlands)

    Filman, R.E.; Elrad, T.; Clarke, S.; Aksit, Mehmet; Unknown, [Unknown

    2004-01-01

    Software development is changing. The opportunities of the Internet, computerized businesses, and computer-savvy consumers, the exponential decline in the cost of computation and communication, and the increasingly dynamic environment for longer-living systems are pressing software developers to

  7. The effect of magnetic stimulation on the osteogenic and chondrogenic differentiation of human stem cells derived from the adipose tissue (hASCs)

    Science.gov (United States)

    Lima, João; Gonçalves, Ana I.; Rodrigues, Márcia T.; Reis, Rui L.; Gomes, Manuela E.

    2015-11-01

    The use of magnetic nanoparticles (MNPs) towards the musculoskeletal tissues has been the focus of many studies, regarding MNPs ability to promote and direct cellular stimulation and orient tissue responses. This is thought to be mainly achieved by mechano-responsive pathways, which can induce changes in cell behavior, including the processes of proliferation and differentiation, in response to external mechanical stimuli. Thus, the application of MNP-based strategies in tissue engineering may hold potential to propose novel solutions for cell therapy on bone and cartilage strategies to accomplish tissue regeneration. The present work aims at studying the influence of MNPs on the osteogenic and chondrogenic differentiation of human adipose derived stem cells (hASCs). MNPs were incorporated in hASCs and cultured in medium supplemented for osteogenic and chondrogenic differentiation. Cultures were maintained up to 28 days with/without an external magnetic stimulus provided by a magnetic bioreactor, to determine if the MNPs alone could affect the osteogenic or chondrogenic phenotype of the hASCs. Results indicate that the incorporation of MNPs does not negatively affect the viability nor the proliferation of hASCs. Furthermore, Alizarin Red staining evidences an enhancement in extracellular (ECM) mineralization under the influence of an external magnetic field. Although not as evident as for osteogenic differentiation, Toluidine blue and Safranin-O stainings also suggest the presence of a cartilage-like ECM with glycosaminoglycans and proteoglycans under the magnetic stimulus provided. Thus, MNPs incorporated in hASCs under the influence of an external magnetic field have the potential to induce differentiation towards the osteogenic and chondrogenic lineages.

  8. Statistical test data selection for reliability evaluation of process computer software

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined, referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de]
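    The sketch below illustrates the stratified-sampling idea mentioned in the record: process states (vectors of input-variable values) are drawn per stratum in proportion to how often each class of demand occurs. The strata, variable ranges, and weights are invented for illustration.

```python
# Hedged sketch: stratified selection of test vectors for process computer software.
import random

random.seed(0)

# Hypothetical strata of process states with their relative frequency of demand
STRATA = {
    "normal_operation": {"weight": 0.90, "temp": (280, 320), "pressure": (60, 70)},
    "transient":        {"weight": 0.09, "temp": (320, 380), "pressure": (70, 85)},
    "rare_fault":       {"weight": 0.01, "temp": (380, 450), "pressure": (85, 110)},
}

def draw_test_cases(n_total):
    cases = []
    for name, s in STRATA.items():
        n = round(n_total * s["weight"])                  # proportional allocation
        cases += [{"stratum": name,
                   "temp": random.uniform(*s["temp"]),
                   "pressure": random.uniform(*s["pressure"])} for _ in range(n)]
    return cases

print(len(draw_test_cases(1000)), "test vectors generated")
```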

  9. Data systems and computer science: Software Engineering Program

    Science.gov (United States)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  10. Monte Carlo simulation with the Gate software using grid computing

    International Nuclear Information System (INIS)

    Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.

    2009-03-01

    Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations needing many replicates to obtain good statistical results can be easily executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in the generation of the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application, thanks to the EGEE Grid (Enabling Grid for E-science), achieving in one week computations that could have taken more than 3 years of processing on a single computer. This work has been achieved thanks to a generic object-oriented toolbox called DistMe which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for random number generator statuses. (authors)
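    A small sketch of the 'Multiple Replications In Parallel' precaution the record mentions: every parallel replicate must receive an independent pseudo-random stream. The example uses NumPy's SeedSequence.spawn to derive non-overlapping child streams and a toy Monte Carlo estimate of pi; it does not reproduce GATE or the DistMe XML format for generator statuses.

```python
# Hedged sketch: independent pseudo-random streams for parallel Monte Carlo replicates.
import numpy as np
from multiprocessing import Pool

def replicate(seed_seq):
    rng = np.random.default_rng(seed_seq)          # independent stream per replicate
    n = 100_000
    hits = np.sum(rng.random(n) ** 2 + rng.random(n) ** 2 < 1.0)
    return 4.0 * hits / n                          # toy Monte Carlo estimate of pi

if __name__ == "__main__":
    children = np.random.SeedSequence(12345).spawn(8)   # 8 non-overlapping streams
    with Pool(4) as pool:
        estimates = pool.map(replicate, children)
    print(np.mean(estimates))
```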

  11. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  12. The WHATs and HOWs of maturing computational and software engineering skills in Russian higher education institutions

    Science.gov (United States)

    Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.

    2018-05-01

    Russian higher education institutions' tradition of teaching large-enrolment classes is impairing students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and the risk of oversights. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.

  13. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
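    A small worked example of the definition just quoted, assuming the common constant-failure-rate (exponential) reliability model R(t) = exp(-λt); the failure rate and mission time below are hypothetical.

```python
# Hedged sketch: probability of failure-free operation under an exponential model.
import math

failure_rate = 1.0e-4        # hypothetical failures per hour
mission_time = 1000.0        # hypothetical hours of operation

reliability = math.exp(-failure_rate * mission_time)
print(f"P(no failure in {mission_time:.0f} h) = {reliability:.3f}")   # ~0.905
```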

  14. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  15. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  16. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. The GDA software can be installed for use on a personal computer with the Windows 95 or Windows NT 4.0 operating system. The computer may be of the 80486 CPU type with 8 megabytes of memory
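    The sketch below illustrates one step such a package performs, locating photopeaks in a channel-count spectrum; it is not the GDA code. The synthetic continuum, peak positions, and detection thresholds are invented.

```python
# Hedged sketch: locate photopeaks in a synthetic gamma-ray spectrum.
import numpy as np
from scipy.signal import find_peaks

channels = np.arange(1024)
spectrum = 50.0 * np.exp(-channels / 400.0)                    # synthetic continuum
for centre, area in [(300, 4000.0), (662, 9000.0)]:            # synthetic photopeaks
    spectrum += area / 25.0 * np.exp(-0.5 * ((channels - centre) / 10.0) ** 2)
spectrum = np.random.default_rng(0).poisson(spectrum)          # counting noise

peaks, _ = find_peaks(spectrum, prominence=40, width=3)
print("peak channels:", peaks)
```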

  17. Use of Monte Carlo simulation software for calculating effective dose in cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gomes B, W. O., E-mail: wilsonottobatista@gmail.com [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho 40301-015, Salvador de Bahia (Brazil)

    2016-10-15

    This study aimed to develop an irradiation geometry applicable to the PCXMC software and, from it, to calculate the effective dose in cone beam computed tomography (CBCT) applications. We evaluated different CBCT units for dental applications: the Carestream CS 9000 3-dimensional tomograph, the i-CAT, and the GENDEX GXCB-500. We initially characterized each protocol by measuring the entrance surface kerma and the kerma-area product, P{sub KA}, with RADCAL solid-state detectors and a PTW transmission chamber. We then introduced the technical parameters of each preset protocol and the geometric conditions into the PCXMC software to obtain the effective dose values. The calculated effective dose is within the range of 9.0 to 15.7 μSv for the CS 9000 3-dimensional tomograph, within the range of 44.5 to 89 μSv for the GXCB-500 equipment, and in the range of 62-111 μSv for the Classical i-CAT equipment. These values were compared with results obtained by dosimetry using TLDs implanted in an anthropomorphic phantom and are considered consistent. The effective dose results are very sensitive to the irradiation geometry (beam position in the mathematical phantom), which is a fragile point of the software's usage, but the tool is very useful for obtaining quick answers regarding the optimization of protocols. We conclude that the PCXMC Monte Carlo simulation software is a useful tool for assessing protocols in CBCT examinations for dental applications. (Author)

  18. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    International Nuclear Information System (INIS)

    Smith, W. Spencer; Koothoor, Mimitha

    2016-01-01

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification

  19. A document-driven method for certifying scientific computing software for use in nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, W. Spencer; Koothoor, Mimitha [Computing and Software Department, McMaster University, Hamilton (Canada)

    2016-04-15

    This paper presents a documentation and development method to facilitate the certification of scientific computing software used in the safety analysis of nuclear facilities. To study the problems faced during quality assurance and certification activities, a case study was performed on legacy software used for thermal analysis of a fuel pin in a nuclear reactor. Although no errors were uncovered in the code, 27 issues of incompleteness and inconsistency were found with the documentation. This work proposes that software documentation follow a rational process, which includes a software requirements specification following a template that is reusable, maintainable, and understandable. To develop the design and implementation, this paper suggests literate programming as an alternative to traditional structured programming. Literate programming allows for documenting of numerical algorithms and code together in what is termed the literate programmer's manual. This manual is developed with explicit traceability to the software requirements specification. The traceability between the theory, numerical algorithms, and implementation facilitates achieving completeness and consistency, as well as simplifies the process of verification and the associated certification.
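    The literate-programming idea described in the two records above can be illustrated loosely as follows. The sketch is Python rather than a dedicated literate-programming system, and the requirement ID and fuel-pin formula are generic textbook material, not the paper's actual artifacts.

```python
# Loose illustration of the literate style described above (not the paper's
# actual tooling or code): the derivation, the numerical formula and a
# traceability tag to a hypothetical requirement ID sit next to the code.

def fuel_pin_temperature_rise(q_vol, radius, k):
    """Centreline-to-surface temperature rise of a solid cylindrical fuel pin
    with uniform volumetric heat generation, at steady state.

    Derivation, kept with the code in the literate-programming spirit:
    integrating (1/r) d/dr (k r dT/dr) = -q_vol with dT/dr = 0 at r = 0
    gives  delta_T = q_vol * radius**2 / (4 * k).

    Traceability: requirement R-THERM-01 (hypothetical ID) in the SRS.
    """
    return q_vol * radius ** 2 / (4.0 * k)

# Example: 300e6 W/m^3 heat generation, 5 mm pin radius, k ~ 3 W/(m K)
print(f"{fuel_pin_temperature_rise(300e6, 0.005, 3.0):.0f} K")
```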

  20. Scilab software as an alternative low-cost computing in solving the linear equations problem

    Science.gov (United States)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open source (non-proprietary) software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems); moreover, the number of variables in linear and non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, activities related to the mathematical models were proposed using the Scilab software. In this experiment, four numerical methods, namely Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition, were implemented. The results of this study show that routines for these numerical methods were created and explored using Scilab procedures, and that these routines can be exploited as teaching material for a course.
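    For readers unfamiliar with the methods named in this record, the sketch below shows one of them, Gaussian elimination with partial pivoting, in NumPy rather than Scilab (the paper itself uses Scilab routines); it is a generic textbook implementation, not the authors' code.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):                        # forward elimination
        p = k + int(np.argmax(np.abs(A[k:, k])))  # pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gaussian_elimination(A, b))    # expected solution: [ 2.  3. -1.]
```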

  1. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements and to help ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.

  2. Three dimensional field computation software package DE3D and its applications

    International Nuclear Information System (INIS)

    Fan Mingwu; Zhang Tianjue; Yan Weili

    1992-07-01

    A software package, DE3D, that can be run on a PC for three-dimensional electrostatic and magnetostatic field analysis has been developed at CIAE (China Institute of Atomic Energy). The two-scalar-potential method and special numerical techniques give the code high precision. It can be used for electrostatic and magnetostatic field computations with complex boundary conditions. In most cases the accuracy of the results is better than 1% compared with measurements. In some situations the results are more acceptable than those of other codes because special techniques are used for the current integral. Typical examples given in the paper, the design of a cyclotron magnet and of the magnetic elements on its beam transport line, show how the program helps the designer to improve the design of the product. The software package could bring advantages to producers and designers.

  3. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  4. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  5. Software And Systems Engineering Risk Management

    Science.gov (United States)

    2010-04-01

    Presentation on software and systems engineering risk management by John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group. The recovered slide fragments appear to trace a timeline of risk management standards: RSKM, the 2004 COSO Enterprise Risk Management Framework, ISO/IEC 16085 Risk Management Process (2006), ISO/IEC 12207 Software Lifecycle Processes (2008), and a 2009 ISO/IEC document (truncated in the source).

  6. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  7. Meteosat Second Generation station: processing software and computing architecture; Estacion de Recepcion de Imagenes del Satelite Meteosat Segunda generacion: Arquitectura Informatica y Software de Proceso

    Energy Technology Data Exchange (ETDEWEB)

    Martin, L; Cony, M; Navarro, A A; Zarzalejo, L F; Polo, J

    2010-05-01

    The Renewable Energy Division of CIEMAT houses a specific station for receiving the Meteosat Second Generation images, which is of interest in the works being carried out concerning the solar radiation derived from satellite images. The complexity, the huge amount of information being received and the particular characteristics of the MSG images encouraged the design and development of a specific computer structure and the associated software as well, for a better and more suitable use of the images. This document describes the mentioned structure and software. (Author) 8 refs.

  8. IMAGE information monitoring and applied graphics software environment. Volume 2. Software description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast proto-typing' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving and evaluating advanced operator aid displays. The software is written to be both host computer and graphic device independent

  9. Evaluating Accounting Software in Secondary Schools.

    Science.gov (United States)

    Chalupa, Marilyn

    1988-01-01

    The secondary accounting curriculum must be modified to include computers and software. Educators must be aware of the computer skills needed on the job and of the accounting software that is available. Software selection must be tailored to fit the curriculum and the time available. (JOW)

  10. The deoxyhypusine synthase mutant dys1-1 reveals the association of eIF5A and Asc1 with cell wall integrity.

    Directory of Open Access Journals (Sweden)

    Fabio Carrilho Galvão

    Full Text Available The putative eukaryotic translation initiation factor 5A (eIF5A) is a highly conserved protein among archaea and eukaryotes that has recently been implicated in the elongation step of translation. eIF5A undergoes an essential and conserved posttranslational modification at a specific lysine to generate the residue hypusine. The enzymes deoxyhypusine synthase (Dys1) and deoxyhypusine hydroxylase (Lia1) catalyze this two-step modification process. Although several Saccharomyces cerevisiae eIF5A mutants have contributed importantly to the study of eIF5A function, no conditional mutant of Dys1 has been described so far. In this study, we generated and characterized the dys1-1 mutant, which showed a strong depletion of the mutated Dys1 protein, resulting in a more than 2-fold decrease in hypusine levels relative to the wild type. The dys1-1 mutant demonstrated a defect in total protein synthesis, a polysome profile indicative of a translation elongation defect, and a reduced association of eIF5A with polysomes. The growth phenotype of the dys1-1 mutant is severe, growing only in the presence of 1 M sorbitol, an osmotic stabilizer. Although this phenotype is characteristic of Pkc1 cell wall integrity mutants, the sorbitol requirement of dys1-1 is not associated with cell lysis. We observed that dys1-1 genetically interacts with the sole yeast protein kinase C (Pkc1) and Asc1, a component of the 40S ribosomal subunit. The dys1-1 mutant was synthetically lethal in combination with asc1Δ, and overexpression of TIF51A (eIF5A) or DYS1 is toxic for an asc1Δ strain. Moreover, eIF5A is more associated with translating ribosomes in the absence of Asc1 in the cell. Finally, analysis of the sensitivity to cell wall-perturbing compounds revealed a more similar behavior of the dys1-1 and asc1Δ mutants in comparison with the pkc1Δ mutant. These data suggest a correlated role for eIF5A and Asc1 in coordinating the translational control of a subset of m

  11. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  12. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Manual on quality assurance for computer software related to the safety of nuclear power plants

    International Nuclear Information System (INIS)

    1988-01-01

    The objective of the Manual is to provide guidance in the assurance of quality of specification, design, maintenance and use of computer software related to items and activities important to safety (hereinafter referred to as safety related) in nuclear power plants. This guidance is consistent with, and supplements, the requirements and recommendations of Quality Assurance for Safety in Nuclear Power Plants: A Code of Practice, 50-C-QA, and related Safety Guides on quality assurance for nuclear power plants. Annex A identifies the IAEA documents referenced in the Manual. The Manual is intended to be of use to all those who, in any way, are involved with software for safety related applications for nuclear power plants, including auditors who may be called upon to audit management systems and product software. Figs

  14. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of the program software for the automated processing of gravity and magnetic survey data for the small Mir-1 and Mir-2 computers, as developed by the Voronezh geophysical expedition.

  15. The effect of magnetic stimulation on the osteogenic and chondrogenic differentiation of human stem cells derived from the adipose tissue (hASCs)

    Energy Technology Data Exchange (ETDEWEB)

    Lima, João; Gonçalves, Ana I.; Rodrigues, Márcia T.; Reis, Rui L. [3Bs Research Group–Biomaterials, Biodegradables and Biomimetics, University of Minho, Guimarães (Portugal); ICVS/3Bs–PT Government Associate Laboratory, Braga/Guimarães (Portugal); Gomes, Manuela E., E-mail: megomes@dep.uminho.pt [3Bs Research Group–Biomaterials, Biodegradables and Biomimetics, University of Minho, Guimarães (Portugal); ICVS/3Bs–PT Government Associate Laboratory, Braga/Guimarães (Portugal)

    2015-11-01

    The use of magnetic nanoparticles (MNPs) for the musculoskeletal tissues has been the focus of many studies, regarding the ability of MNPs to promote and direct cellular stimulation and orient tissue responses. This is thought to be mainly achieved by mechano-responsive pathways, which can induce changes in cell behavior, including the processes of proliferation and differentiation, in response to external mechanical stimuli. Thus, the application of MNP-based strategies in tissue engineering may hold potential to propose novel solutions for cell therapy in bone and cartilage strategies to accomplish tissue regeneration. The present work aims at studying the influence of MNPs on the osteogenic and chondrogenic differentiation of human adipose derived stem cells (hASCs). MNPs were incorporated in hASCs and cultured in medium supplemented for osteogenic and chondrogenic differentiation. Cultures were maintained up to 28 days with/without an external magnetic stimulus provided by a magnetic bioreactor, to determine if the MNPs alone could affect the osteogenic or chondrogenic phenotype of the hASCs. Results indicate that the incorporation of MNPs does not negatively affect the viability or the proliferation of hASCs. Furthermore, Alizarin Red staining evidences an enhancement in extracellular matrix (ECM) mineralization under the influence of an external magnetic field. Although not as evident as for osteogenic differentiation, Toluidine blue and Safranin-O stainings also suggest the presence of a cartilage-like ECM with glycosaminoglycans and proteoglycans under the magnetic stimulus provided. Thus, MNPs incorporated in hASCs under the influence of an external magnetic field have the potential to induce differentiation towards the osteogenic and chondrogenic lineages. - Highlights: • Cellular viability was not negatively influenced by the nanoparticles. • Chondrogenic medium influences more the synthesis of cartilage-like ECM than MNPs. • Synergetic effect among

  16. Genomic presence of gadD1 glutamate decarboxylase correlates with the organization of ascB-dapE internalin cluster in Listeria monocytogenes.

    Science.gov (United States)

    Chen, Jianshun; Fang, Chun; Zheng, Tianlun; Zhu, Ningyu; Bei, Yijiang; Fang, Weihuan

    2012-02-01

    The ability to survive and proliferate in acidic environments is a prerequisite for the infection of Listeria monocytogenes. The glutamate decarboxylase (GAD) system is responsible for acid resistance, and three GAD homologs have been identified in L. monocytogenes: gadD1, gadD2, and gadD3. To examine whether GAD genes are specific to lineage, serovar, or certain subpopulation, we performed a systematic investigation on the prevalence of GAD genes in 164 L. monocytogenes. In contrast to gadD2 and gadD3 conserved in all L. monocytogenes strains, gadD1 was identified in 36.6% (60/164) of L. monocytogenes strains, including all serovar 1/2c and 68.5% (37/54) of serovar 1/2a strains, as well as a small fraction of serovar 1/2b (3.4%, 1/29) and lineage III (13.8%, 4/29) strains. All serovar 4b and lineage IV strains lacked this gene. According to the ascB-dapE structure, L. monocytogenes strains were classified into four subpopulations, carrying inlC2DE, inlGC2DE, inlGHE, or no internalin cluster, respectively. All L. monocytogenes strains with inlGC2DE or inlGHE pattern harbored gadD1, whereas those bearing inlC2DE or no internalin cluster between ascB and dapE lacked gadD1. In addition, other five non-monocytogenes Listeria species lacking ascB-dapE internalin cluster were gadD1-negative. Overall, the presence of gadD1 is not fully dependent on lineages or serovars but correlates with ascB-dapE internalin profiles, suggesting gadD1 might have co-evolved with the ascB-dapE internalin cluster in the primitive L. monocytogenes before divergence of serovars.

  17. The effect of magnetic stimulation on the osteogenic and chondrogenic differentiation of human stem cells derived from the adipose tissue (hASCs)

    International Nuclear Information System (INIS)

    Lima, João; Gonçalves, Ana I.; Rodrigues, Márcia T.; Reis, Rui L.; Gomes, Manuela E.

    2015-01-01

    The use of magnetic nanoparticles (MNPs) for the musculoskeletal tissues has been the focus of many studies, regarding the ability of MNPs to promote and direct cellular stimulation and orient tissue responses. This is thought to be mainly achieved by mechano-responsive pathways, which can induce changes in cell behavior, including the processes of proliferation and differentiation, in response to external mechanical stimuli. Thus, the application of MNP-based strategies in tissue engineering may hold potential to propose novel solutions for cell therapy in bone and cartilage strategies to accomplish tissue regeneration. The present work aims at studying the influence of MNPs on the osteogenic and chondrogenic differentiation of human adipose derived stem cells (hASCs). MNPs were incorporated in hASCs and cultured in medium supplemented for osteogenic and chondrogenic differentiation. Cultures were maintained up to 28 days with/without an external magnetic stimulus provided by a magnetic bioreactor, to determine if the MNPs alone could affect the osteogenic or chondrogenic phenotype of the hASCs. Results indicate that the incorporation of MNPs does not negatively affect the viability or the proliferation of hASCs. Furthermore, Alizarin Red staining evidences an enhancement in extracellular matrix (ECM) mineralization under the influence of an external magnetic field. Although not as evident as for osteogenic differentiation, Toluidine blue and Safranin-O stainings also suggest the presence of a cartilage-like ECM with glycosaminoglycans and proteoglycans under the magnetic stimulus provided. Thus, MNPs incorporated in hASCs under the influence of an external magnetic field have the potential to induce differentiation towards the osteogenic and chondrogenic lineages. - Highlights: • Cellular viability was not negatively influenced by the nanoparticles. • Chondrogenic medium influences more the synthesis of cartilage-like ECM than MNPs. • Synergetic effect among

  18. The Future of Software Engineering for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Pope, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-16

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions from a software engineering perspective transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write up done as if the author was a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance to DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one liner has also been added to each topic to allow future risk tracking and mitigation.

  19. The Computational Infrastructure for Geodynamics: An Example of Software Curation and Citation in the Geodynamics Community

    Science.gov (United States)

    Hwang, L.; Kellogg, L. H.

    2017-12-01

    Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc - attribution builder for citation to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then requested the primary developers to verify. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered is based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual as developers are forward-looking, rarely willing to go back and archive prior releases in zenodo. Going forward all actively developed packages will utilize the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.

  20. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    Science.gov (United States)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  1. Genomecmp: computer software to detect genomic rearrangements using markers

    Science.gov (United States)

    Kulawik, Maciej; Nowak, Robert M.

    2017-08-01

    Detection of genomic rearrangements is a tough task because of the size of the data to be processed. As genome sequences may consist of hundreds of millions of symbols, it is not only practically impossible to compare them by hand, but it is also a complex problem for computer software. The way to significantly accelerate the process is to use a rearrangement detection algorithm based on unique short sequences called markers. The algorithm described in this paper derives markers from a base genome and finds the positions of the markers on another genome. The algorithm has been extended with support for ambiguity symbols. A web application with a graphical user interface has been created using a three-layer architecture, in which users can run tasks simultaneously. The accuracy and efficiency of the proposed solution have been studied using generated and real data.
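    The marker-based approach summarized above lends itself to a compact illustration. The sketch below is a simplified toy in Python, not the genomecmp implementation, and it ignores ambiguity symbols: it extracts k-mers that occur exactly once in the base genome, finds them in a second genome, and reports the paired positions whose ordering hints at rearrangements.

```python
from collections import Counter

def unique_kmers(genome, k):
    """k-mers that occur exactly once in the base genome (candidate markers)."""
    counts = Counter(genome[i:i + k] for i in range(len(genome) - k + 1))
    return [(genome[i:i + k], i) for i in range(len(genome) - k + 1)
            if counts[genome[i:i + k]] == 1]

def marker_positions(base, other, k=8):
    """(base position, other position) for markers that are also unique in the
    other genome; breaks or reversals in the ordering hint at rearrangements."""
    pairs = []
    for kmer, base_pos in unique_kmers(base, k):
        hit = other.find(kmer)
        if hit != -1 and other.find(kmer, hit + 1) == -1:
            pairs.append((base_pos, hit))
    return pairs

base = "ACGTACGGTTCAGCTAAGGCTTACGATCGTTGACCT"
other = base[:12] + base[12:24][::-1] + base[24:]   # toy rearrangement: reversed block
for base_pos, other_pos in marker_positions(base, other):
    print(base_pos, other_pos)
```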

  2. Ascorbic acid intake and serum levels in adult smoking and non-smoking men in the city of Hermosillo, Sonora, Mexico

    OpenAIRE

    Méndez E, Rosa Olivia; Wyatt, C. Jane; Saavedra, Javier; Ornelas, Alicia

    2002-01-01

    Ascorbic acid is one of the most important antioxidants at the extracellular level; however, its preventive role against degenerative diseases can be compromised when its serum levels decrease. Low serum ascorbic acid values have been reported in men who smoke. In the present study, ascorbic acid intake was estimated in 25 healthy adult men from Hermosillo, Sonora, Mexico, divided into two groups, smokers and non-smokers; the serum levels of asco...

  3. Micro-computer based control system and its software for carrying out the sequential acceleration on SMCAMS

    International Nuclear Information System (INIS)

    Li Deming

    2001-01-01

    The micro-computer based control system and its software for carrying out sequential acceleration on SMCAMS are described. The establishment of the 14C particle measuring device and the improvement of the original power supply system are also described.

  4. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  5. Instrumentation, computer software and experimental techniques used in low-frequency internal friction studies at WNRE

    International Nuclear Information System (INIS)

    Sprugmann, K.W.; Ritchie, I.G.

    1980-04-01

    A detailed and comprehensive account of the equipment, computer programs and experimental methods developed at the Whiteshell Nuclear Research Establishment for the study of low-frequency internal friction is presented. Part I describes the mechanical apparatus, electronic instrumentation and computer software, while Part II describes in detail the laboratory techniques and various types of experiments performed, together with data reduction and analysis. Experimental procedures for the study of internal friction as a function of temperature, strain amplitude or time are described. Computer control of these experiments using the free-decay technique is outlined. In addition, a pendulum constant-amplitude drive system is described. (auth)
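    As background to the free-decay technique mentioned in this record, low-frequency internal friction is commonly reported as Q⁻¹ = δ/π, where δ is the logarithmic decrement of successive oscillation amplitudes. The sketch below (Python, with synthetic amplitudes rather than WNRE data) shows the reduction.

```python
import numpy as np

# Free-decay reduction: fit log(amplitude) against cycle number; the slope
# magnitude is the logarithmic decrement delta, and Q^-1 = delta / pi.
# The amplitudes below are synthetic, not WNRE measurements.
amplitudes = np.array([1.000, 0.950, 0.902, 0.857, 0.815, 0.774])
cycles = np.arange(len(amplitudes))

delta = -np.polyfit(cycles, np.log(amplitudes), 1)[0]
q_inverse = delta / np.pi

print(f"log decrement = {delta:.4f}, internal friction Q^-1 = {q_inverse:.4e}")
```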

  6. General-purpose software for science technology calculation

    International Nuclear Information System (INIS)

    Aikawa, Hiroshi

    1999-01-01

    We have developed a range of general-purpose software for parallel processing in science and technology calculations. This paper reports six of these packages: the STA (Seamless Thinking Aid) base software, a parallel numerical computation library, grid generation software for parallel computers, a real-time visualization system, a parallel benchmark test system and an object-oriented parallel programming method. STA is user interface software that provides a total environment for parallel programming, a network computing environment for various parallel computers and a desktop computing environment via the Web. Some examples using the above software are explained. One of them is a simultaneous parallel calculation of both the flow and the structure of a supersonic transport for design purposes. The other covers various kinds of parallel computer calculations for nuclear fusion, such as a molecular dynamics calculation and a calculation of reactor structure and fluid. The software is open to the public via the home page {http://guide.tokai.jaeri.go.jp/ccse/}. (S.Y.)

  7. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  8. Quantitative software-reliability analysis of computer codes relevant to nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.

    1981-12-01

    This report presents the results of the first year of an ongoing research program to determine the probability of failure characteristics of computer codes relevant to nuclear safety. An introduction to both qualitative and quantitative aspects of nuclear software is given. A mathematical framework is presented which will enable the a priori prediction of the probability of failure characteristics of a code given the proper specification of its properties. The framework consists of four parts: (1) a classification system for software errors and code failures; (2) probabilistic modeling for selected reliability characteristics; (3) multivariate regression analyses to establish predictive relationships among reliability characteristics and generic code property and development parameters; and (4) the associated information base. Preliminary data of the type needed to support the modeling and the predictions of this program are described. Illustrations of the use of the modeling are given but the results so obtained, as well as all results of code failure probabilities presented herein, are based on data which at this point are preliminary, incomplete, and possibly non-representative of codes relevant to nuclear safety
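    As a toy illustration of the probabilistic modeling mentioned in item (2) of the framework, the sketch below gives a generic Bayesian estimate of the probability of failure per demand, a Beta prior updated with observed demand and failure counts. It is not the report's actual model or data.

```python
# Toy Bayesian estimate of the probability of failure per demand of a code:
# a Beta(a, b) prior updated with observed demands and failures. This is a
# generic illustration, not the modeling framework or data of the report.
def posterior_failure_probability(failures, demands, a=1.0, b=1.0):
    """Posterior mean of a Beta-Binomial model for failure probability."""
    return (a + failures) / (a + b + demands)

print(posterior_failure_probability(failures=0, demands=500))   # ~0.002
print(posterior_failure_probability(failures=2, demands=500))   # ~0.006
```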

  9. 48 CFR 252.227-7018 - Rights in noncommercial technical data and computer software-Small Business Innovation Research...

    Science.gov (United States)

    2010-10-01

    ... be at the stage where it could be offered for sale or sold on the commercial market, nor must the... software required for safekeeping (archive), backup, or modification purposes; (iv) Modify computer...

  10. Software usage in unsupervised digital doorway computing environments in disadvantaged South African communities: Focusing on youthful users

    CSIR Research Space (South Africa)

    Gush, K

    2011-01-01

    Full Text Available Digital Doorways provide computing infrastructure in low-income communities in South Africa. The unsupervised DD terminals offer various software applications, from entertainment through educational resources to research material, encouraging...

  11. 2nd International Workshop on Eigenvalue Problems : Algorithms, Software and Applications in Petascale Computing

    CERN Document Server

    Zhang, Shao-Liang; Imamura, Toshiyuki; Yamamoto, Yusaku; Kuramashi, Yoshinobu; Hoshi, Takeo

    2017-01-01

    This book provides state-of-the-art and interdisciplinary topics on solving matrix eigenvalue problems, particularly by using recent petascale and upcoming post-petascale supercomputers. It gathers selected topics presented at the International Workshops on Eigenvalue Problems: Algorithms, Software and Applications in Petascale Computing (EPASA2014 and EPASA2015), which brought together leading researchers working on the numerical solution of matrix eigenvalue problems to discuss and exchange ideas – and in so doing helped to create a community for researchers in eigenvalue problems. The topics presented in the book, including novel numerical algorithms, high-performance implementation techniques, software developments and sample applications, will contribute to various fields that involve solving large-scale eigenvalue problems.

  12. THE TECHNIQUE OF ANALYSIS OF SOFTWARE OF ON-BOARD COMPUTERS OF AIR VESSEL TO ABSENCE OF UNDECLARED CAPABILITIES BY SIGNATURE-HEURISTIC WAY

    Directory of Open Access Journals (Sweden)

    Viktor Ivanovich Petrov

    2017-01-01

    Full Text Available The article considers data safety issues for the onboard computers of civil aviation aircraft. In information security, undeclared capabilities are technical equipment or software capabilities that are not mentioned in the documentation. Requirements on documentation and test content are imposed during software certification. Documentation requirements cover the composition and content of the controlled documents (specification, description and program code, including the source code). Test requirements include static analysis of program code (including checking the compliance of the sources with their load modules) and dynamic analysis of source code (including monitoring of execution paths). Currently, there are no comprehensive measures for checking onboard computer software. There are no rules and regulations that allow the software of foreign-produced aircraft to be controlled, and actually obtaining the software is difficult. Consequently, the author suggests developing the basics of aviation rules and regulations that allow the programs of civil aviation aircraft onboard computers to be analyzed. If no source code is available, two approaches to code analysis are used: structural static and dynamic analysis of the code, and signature-heuristic analysis of potentially dangerous operations. Static analysis determines the behavior of the program by reading the program code, without running the program, as represented in assembly language (a disassembly listing). Program tracing is performed by dynamic analysis. The article considers the ability to detect undeclared capabilities in aircraft software using an interactive disassembler.
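    To make the signature-heuristic approach concrete, the sketch below scans a disassembly listing for instruction patterns flagged as potentially dangerous. The signatures and the listing format are hypothetical illustrations in Python, not part of any approved aviation analysis tool.

```python
import re

# Toy signature-heuristic scan of a disassembly listing: each signature is a
# regular expression over instruction text flagged as potentially dangerous.
# The patterns and the listing format are hypothetical illustrations only.
SIGNATURES = {
    "software interrupt": re.compile(r"\bint\s+0x", re.IGNORECASE),
    "indirect jump":      re.compile(r"\bjmp\s+\[", re.IGNORECASE),
    "write to code seg":  re.compile(r"\bmov\s+\[cs:", re.IGNORECASE),
}

def scan_listing(lines):
    """Yield (line number, signature name, text) for every match."""
    for number, text in enumerate(lines, start=1):
        for name, pattern in SIGNATURES.items():
            if pattern.search(text):
                yield number, name, text.strip()

listing = [
    "0040100A  mov eax, ebx",
    "0040100C  int 0x80",
    "0040100E  jmp [eax+4]",
]
for hit in scan_listing(listing):
    print(hit)
```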

  13. 2011 Computation Directorate Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s-all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence

  14. Prediction of cervical intraepithelial neoplasia grade 2+ (CIN2+) using HPV DNA testing after a diagnosis of atypical squamous cell of undetermined significance (ASC-US) in Catalonia, Spain.

    Science.gov (United States)

    Ibáñez, Raquel; Moreno-Crespi, Judit; Sardà, Montserrat; Autonell, Josefina; Fibla, Montserrat; Gutiérrez, Cristina; Lloveras, Belen; Alejo, María; Català, Isabel; Alameda, Francesc; Casas, Miquel; Bosch, F Xavier; de Sanjosé, Silvia

    2012-01-26

    A protocol for cervical cancer screening among sexually active women 25 to 65 years of age was introduced in 2006 in Catalonia, Spain, to increase coverage and to recommend a 3-year interval between screening cytologies. In addition, Human Papillomavirus (HPV) testing was offered as a triage test for women with a diagnosis of atypical squamous cells of undetermined significance (ASC-US). HPV testing was recommended within 3 months of the ASC-US diagnosis. According to the protocol, HPV-negative women were referred to regular screening, including a cytological exam every 3 years, while HPV-positive women were referred to colposcopy and closer follow-up. We evaluated the implementation of the protocol and the value of HPV testing as a triage tool for predicting cervical intraepithelial neoplasia grade two or worse (CIN2+) in women with a cytological diagnosis of ASC-US. During 2007-08, a total of 611 women from five reference laboratories in Catalonia with a new diagnosis of ASC-US were referred for high-risk HPV (hrHPV) triage using high-risk Hybrid Capture version 2. Using routine record linkage data, women were followed for 3 years to evaluate the efficacy of hrHPV testing for predicting CIN2+ cases. Logistic regression analysis was used to estimate the odds ratio for CIN2+. Among the 611 women diagnosed with ASC-US, 493 (80.7%) had at least one follow-up visit during the study period. hrHPV was detected in 48.3% of the women at study entry (mean age 35.2 years). hrHPV positivity decreased with increasing age, from 72.6% among women younger than 25 years to 31.6% in women older than 54 years (p < 0.01). At the end of the 3-year follow-up period, 37 women with a diagnosis of CIN2+ (18 CIN2, 16 CIN3, 2 cancers, and 1 with a high-grade squamous intraepithelial lesion, HSIL) were identified, and all but one had a hrHPV-positive test at study entry. The sensitivity of hrHPV testing to detect CIN2+ was 97.2% (95% confidence interval (CI) = 85.5-99.9) and the specificity was 68.3% (95% CI = 63.1-73.2). The odds ratio for CIN2
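    For reference, the sensitivity and specificity reported above come from a standard 2x2 cross-tabulation of the triage test result against the CIN2+ outcome. The sketch below uses invented counts, not the study's data, purely to show the computation.

```python
# Sensitivity and specificity from a 2x2 table of triage result vs outcome.
# The counts below are invented for illustration; they are not the study's data.
true_pos, false_neg = 45, 5      # CIN2+ cases:    hrHPV positive / negative
false_pos, true_neg = 150, 300   # no CIN2+ found: hrHPV positive / negative

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
odds_ratio = (true_pos * true_neg) / (false_pos * false_neg)

print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}, OR = {odds_ratio:.1f}")
```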

  15. ASC-J9 Suppresses Castration-Resistant Prostate Cancer Growth through Degradation of Full-length and Splice Variant Androgen Receptors

    Directory of Open Access Journals (Sweden)

    Shinichi Yamashita

    2012-01-01

    Full Text Available Early studies suggested androgen receptor (AR splice variants might contribute to the progression of prostate cancer (PCa into castration resistance. However, the therapeutic strategy to target these AR splice variants still remains unresolved. Through tissue survey of tumors from the same patients before and after castration resistance, we found that the expression of AR3, a major AR splice variant that lacks the AR ligand-binding domain, was substantially increased after castration resistance development. The currently used antiandrogen, Casodex, showed little growth suppression in CWR22Rv1 cells. Importantly, we found that AR degradation enhancer ASC-J9 could degrade both full-length (fAR and AR3 in CWR22Rv1 cells as well as in C4-2 and C81 cells with addition of AR3. The consequences of such degradation of both fAR and AR3 might then result in the inhibition of AR transcriptional activity and cell growth in vitro. More importantly, suppression of AR3 specifically by short-hairpin AR3 or degradation of AR3 by ASC-J9 resulted in suppression of AR transcriptional activity and cell growth in CWR22Rv1-fARKD (fAR knockdown cells in which DHT failed to induce, suggesting the importance of targeting AR3. Finally, we demonstrated the in vivo therapeutic effects of ASC-J9 by showing the inhibition of PCa growth using the xenografted model of CWR22Rv1 cells orthotopically implanted into castrated nude mice with undetectable serum testosterone. These results suggested that targeting both fAR- and AR3-mediated PCa growth by ASC-J9 may represent the novel therapeutic approach to suppress castration-resistant PCa. Successful clinical trials targeting both fAR and AR3 may help us to battle castration-resistant PCa in the future.

  16. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
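    A minimal concrete case of the likelihood ratios discussed above is a single-locus comparison of the hypothesis that an unidentified individual is the child of two typed parents against the hypothesis of an unrelated person. The sketch below assumes Hardy-Weinberg allele frequencies and ignores mutation and dropout, so it is far simpler than what the Bonaparte software handles; the allele labels and frequencies are illustrative.

```python
# Single-locus likelihood ratio: H1 = "the unidentified person is the child of
# the two typed parents" vs H2 = "an unrelated individual". Hardy-Weinberg
# frequencies are assumed; mutation and dropout are ignored, unlike real tools.
def transmit_prob(parent, allele):
    """Probability that a parent with genotype (a, b) transmits `allele`."""
    return 0.5 * parent.count(allele)

def lr_child_vs_unrelated(child, parent1, parent2, freq):
    c1, c2 = child
    # P(child genotype | parents) by Mendelian transmission.
    p_h1 = transmit_prob(parent1, c1) * transmit_prob(parent2, c2)
    if c1 != c2:
        p_h1 += transmit_prob(parent1, c2) * transmit_prob(parent2, c1)
    # P(child genotype | random person) under Hardy-Weinberg.
    p_h2 = freq[c1] ** 2 if c1 == c2 else 2 * freq[c1] * freq[c2]
    return p_h1 / p_h2

freqs = {"12": 0.10, "14": 0.25, "16": 0.30}   # illustrative allele frequencies
# child 12/14, parents 12/16 and 14/14: numerator 0.5, denominator 0.05 -> LR = 10
print(lr_child_vs_unrelated(("12", "14"), ("12", "16"), ("14", "14"), freqs))
```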

  17. Detection of Steganography-Producing Software Artifacts on Crime-Related Seized Computers

    Directory of Open Access Journals (Sweden)

    Asawaree Kulkarni

    2009-06-01

    Full Text Available Steganography is the art and science of hiding information within information so that an observer does not know that communication is taking place. Bad actors passing information using steganography are of concern to the national security establishment and law enforcement. An attempt was made to determine if steganography was being used by criminals to communicate information. Web crawling technology was used and images were downloaded from Web sites that were considered likely candidates for containing information hidden using steganographic techniques. A detection tool was used to analyze these images. The research failed to demonstrate that steganography was prevalent on the public Internet. The probable reasons included the growth and availability of a large number of steganography-producing tools and the limited capacity of the detection tools to cope with them. Thus, a redirection was introduced in the methodology and the detection focus was shifted from the analysis of the ‘product’ of the steganography-producing software, viz. the images, to the ‘artifacts’ left by the steganography-producing software while it is being used to generate steganographic images. This approach was based on the concept of a ‘Stego-Usage Timeline’. As a proof of concept, a sample set of criminal computers was scanned for the remnants of steganography-producing software. The results demonstrated that the problem of detecting the usage of steganography could be addressed by the approach adopted after the research redirection and that certain steganographic software was popular among the criminals. Thus, the contribution of the research was in demonstrating that the limitations of tools based on signature detection of steganographically altered images can be overcome by focusing the detection effort on the artifacts of the steganography-producing tools.
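    The 'artifact'-centred detection described above amounts to sweeping seized media for remnants of known steganography tools. The sketch below illustrates the idea in Python with a hypothetical artifact list and mount point; it is not the tool set or signature database used in the study.

```python
import os

# Toy sweep of a mounted evidence image for remnants of steganography tools.
# The artifact names and mount point are hypothetical placeholders, not the
# signature set used in the study.
KNOWN_ARTIFACTS = {"stegotool.exe", "hide4me.ini", "stegcache.db"}

def find_artifacts(root):
    """Yield full paths whose file name matches a known artifact name."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower() in KNOWN_ARTIFACTS:
                yield os.path.join(dirpath, name)

for path in find_artifacts("/mnt/evidence"):
    print("possible steganography artifact:", path)
```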

  18. Cross-Platform Learning Media Development of Software Installation on Computer Engineering and Networking Expertise Package

    Directory of Open Access Journals (Sweden)

    Afis Pratama

    2018-03-01

    Full Text Available Software installation is one of the important lessons that must be mastered by students in the computer and network engineering expertise package. However, students often lack attention and concentration when following the teaching and learning process in the software installation subject, and this problem needs a solution. This research draws on the continual advance of technology, which can be used as a tool to support learning activities. Currently, all grade 10 students at the public vocational high school SMK 8 Semarang, Indonesia, already have a gadget, either a smartphone or a laptop, and the intensity of usage is high enough. Based on this phenomenon, this research aims to create cross-platform learning media for software installation, practical media that can easily be carried on a smartphone or a laptop with a different operating system. This media is therefore expected to improve the learning outcomes, understanding and enthusiasm of the students in the software installation lesson.

  19. Software for computers in the safety systems of nuclear power stations. Identical with IEC 45A(Central Office)88. Draft. Software fuer Rechner im Sicherheitssystem von Kernkraftwerken. Identisch mit IEC 45A(CO)88. Entwurf

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    The basic principles for the design of nuclear instrumentation as specifically applied to the safety systems of nuclear power plants have been interpreted in existing standards, such as the IAEA Safety Guide 50-SG-D3, with a view to hardwired systems. This publication has been developed to interpret these principles for the utilization of digital systems - multiprocessor distributed systems as well as larger scale central processor systems - in the safety systems of nuclear power plants. It is important to note that this document establishes no additional functional requirements for safety systems. Areas which have been dealt with because of the unique nature of digital computer systems, especially the software, are: a) Established hardware criteria as far as they affect the software, with care taken to account for the high degree of interdependency between hardware and software. b) A general approach to software development to assure the production of the highly reliable software required. c) A general approach to software verification and computer system validation. d) Procedures for software maintenance, modification and configuration control. The systems are in accordance with the German KTA regulation KTA 3501.2. (orig./HP).

  20. Typical use of inverse dynamics in perceiving motion in autistic adults: Exploring computational principles of perception and action.

    Science.gov (United States)

    Takamuku, Shinya; Forbes, Paul A G; Hamilton, Antonia F de C; Gomi, Hiroaki

    2018-05-07

    There is increasing evidence for motor difficulties in many people with autism spectrum condition (ASC). These difficulties could be linked to differences in the use of internal models which represent relations between motions and forces/efforts. The use of these internal models may be dependent on the cerebellum which has been shown to be abnormal in autism. Several studies have examined internal computations of forward dynamics (motion from force information) in autism, but few have tested the inverse dynamics computation, that is, the determination of force-related information from motion information. Here, we examined this ability in autistic adults by measuring two perceptual biases which depend on the inverse computation. First, we asked participants whether they experienced a feeling of resistance when moving a delayed cursor, which corresponds to the inertial force of the cursor implied by its motion-both typical and ASC participants reported similar feelings of resistance. Second, participants completed a psychophysical task in which they judged the velocity of a moving hand with or without a visual cue implying inertial force. Both typical and ASC participants perceived the hand moving with the inertial cue to be slower than the hand without it. In both cases, the magnitude of the effects did not differ between the two groups. Our results suggest that the neural systems engaged in the inverse dynamics computation are preserved in ASC, at least in the observed conditions. Autism Res 2018. © 2018 International Society for Autism Research, Wiley Periodicals, Inc. We tested the ability to estimate force information from motion information, which arises from a specific "inverse dynamics" computation. Autistic adults and a matched control group reported feeling a resistive sensation when moving a delayed cursor and also judged a moving hand to be slower when it was pulling a load. These findings both suggest that the ability to estimate force information from
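    In its simplest rigid-body form, the inverse dynamics computation studied here, recovering force-related information from motion, is just Newton's second law applied to an observed trajectory. The sketch below uses synthetic cursor positions and an assumed mass, unrelated to the study's stimuli.

```python
import numpy as np

# Minimal inverse-dynamics illustration: from sampled positions of a moving
# object, estimate acceleration by finite differences and infer the implied
# inertial force F = m * a. Positions and mass are synthetic placeholders.
dt = 0.01                                   # sampling interval (s)
t = np.arange(0.0, 1.0, dt)
position = 0.2 * np.sin(2 * np.pi * t)      # metres, synthetic trajectory

velocity = np.gradient(position, dt)
acceleration = np.gradient(velocity, dt)

mass = 0.5                                  # kg, assumed effective mass
inertial_force = mass * acceleration        # newtons

print(f"peak implied inertial force: {np.abs(inertial_force).max():.2f} N")
```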

  1. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  2. Computer Support of Semantic Text Analysis of a Technical Specification on Designing Software

    OpenAIRE

    Zaboleeva-Zotova, Alla; Orlova, Yulia

    2009-01-01

    This work is devoted to the development of a computer-aided system for semantic text analysis of a technical specification. The purpose of the work is to increase the efficiency of software engineering through automation of the semantic text analysis of a technical specification. A technique for the text analysis of a technical specification is proposed and investigated, together with an expanded fuzzy attribute grammar of a technical specification intended for formaliza...

  3. 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2015) which was held on June 1 – 3, 2015 in Takamatsu, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The book covers research results on all aspects (theory, applications and tools) of computer and information science, and discusses the practical challenges encountered along the way and the solutions adopted to solve them.

  4. 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    SNPD 2016

    2016-01-01

    This edited book presents scientific results of the 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2016), which was held on May 30 - June 1, 2016 in Shanghai, China. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  5. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    Science.gov (United States)

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003

  6. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  7. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Ursula [Vienna General Hospital, Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Heidinger, Benedikt H.; Bankier, Alexander A. [Harvard Medical School, Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Anderson, Kevin R.; VanderLaan, Paul A. [Harvard Medical School, Pathology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Westmore, Michael S. [Imbio, Delafield, WI (United States)

    2018-01-15

    To assess the performance of the "Computer-Aided Nodule Assessment and Risk Yield" (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive adenocarcinomas (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of adenocarcinomas manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)
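
    The decision rule reported above is a simple threshold on the fraction of low-risk voxels. A minimal illustrative sketch of that rule follows; it is not the CANARY implementation, and the labels and data are made up.

        import numpy as np

        def likely_invasive(risk_labels, threshold=0.45):
            # risk_labels: per-voxel labels "low", "intermediate", "high"
            labels = np.asarray(risk_labels)
            low_fraction = np.mean(labels == "low")
            return low_fraction <= threshold          # invasive when low-risk share is small

        nodule = ["low"] * 30 + ["intermediate"] * 50 + ["high"] * 20   # toy voxel labels
        print(likely_invasive(nodule))                # True: only 30% low-risk components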

  8. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    International Nuclear Information System (INIS)

    Nemec, Ursula; Heidinger, Benedikt H.; Bankier, Alexander A.; Anderson, Kevin R.; VanderLaan, Paul A.; Westmore, Michael S.

    2018-01-01

    To assess the performance of the ''Computer-Aided Nodule Assessment and Risk Yield'' (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel-densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive ACs (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of ACs manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)

  9. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to support an ecosystem of clients, developers and other key stakeholders.

  10. Prediction of cervical intraepithelial neoplasia grade 2+ (CIN2+) using HPV DNA testing after a diagnosis of atypical squamous cells of undetermined significance (ASC-US) in Catalonia, Spain

    Directory of Open Access Journals (Sweden)

    Ibáñez Raquel

    2012-01-01

    Full Text Available Abstract Background A protocol for cervical cancer screening among sexually active women 25 to 65 years of age was introduced in 2006 in Catalonia, Spain, to increase coverage and to recommend a 3-year interval between screening cytology. In addition, Human Papillomavirus (HPV) testing was offered as a triage test for women with a diagnosis of atypical squamous cells of undetermined significance (ASC-US). HPV testing was recommended within 3 months of the ASC-US diagnosis. According to the protocol, HPV-negative women were referred to regular screening, including a cytological exam every 3 years, while HPV-positive women were referred to colposcopy and closer follow-up. We evaluated the implementation of the protocol and the prediction of HPV testing as a triage tool for cervical intraepithelial lesions grade two or worse (CIN2+) in women with a cytological diagnosis of ASC-US. Methods During 2007-08, a total of 611 women from five reference laboratories in Catalonia with a novel diagnosis of ASC-US were referred for high-risk HPV (hrHPV) triage using high-risk Hybrid Capture version 2. Using routine record linkage data, women were followed for 3 years to evaluate the efficacy of hrHPV testing for predicting CIN2+ cases. Logistic regression analysis was used to estimate the odds ratio for CIN2+. Results Among the 611 women diagnosed with ASC-US, 493 (80.7%) had at least one follow-up visit during the study period. hrHPV was detected in 48.3% of the women at study entry (mean age 35.2 years). hrHPV positivity decreased with increasing age, from 72.6% among women younger than 25 years to 31.6% in women older than 54 years. At the end of the 3-year follow-up period, 37 women with a diagnosis of CIN2+ (18 CIN2, 16 CIN3, 2 cancers, and 1 with high-grade squamous intraepithelial lesion, HSIL) were identified, and all but one had an hrHPV-positive test at study entry. The sensitivity of hrHPV to detect CIN2+ was 97.2% (95% confidence interval (CI) = 85.5-99.9) and the specificity was 68.3% (95% CI
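
    Triage performance of this kind is summarised by sensitivity and specificity computed from a two-by-two table of hrHPV result against CIN2+ outcome. A small sketch of that calculation with toy counts (not the study's data):

        def sensitivity_specificity(tp, fp, fn, tn):
            sensitivity = tp / (tp + fn)   # CIN2+ cases correctly flagged hrHPV-positive
            specificity = tn / (tn + fp)   # non-cases correctly hrHPV-negative
            return sensitivity, specificity

        # toy counts, not the paper's data
        sens, spec = sensitivity_specificity(tp=35, fp=150, fn=1, tn=320)
        print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")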

  11. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  12. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    Full Text Available We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
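
    For orientation, a minimal ProjectQ-style usage sketch along the lines of the project's documented random-bit example (allocate a qubit, apply a Hadamard gate, measure); treat it as illustrative rather than a snippet taken from this record.

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # default compiler chain + simulator back-end
        qubit = eng.allocate_qubit()
        H | qubit                       # put the qubit into superposition
        Measure | qubit                 # collapse to 0 or 1
        eng.flush()                     # send the queued gates to the back-end
        print(int(qubit))               # the measured random bit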

  13. Software Method for Computed Tomography Cylinder Data Unwrapping, Re-slicing, and Analysis

    Science.gov (United States)

    Roth, Don J.

    2013-01-01

    A software method has been developed that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography (CT). This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2D sheets (or flattened onion skins) in addition to a series of top view slices and a 3D volume rendering. The advantages of viewing the data in this fashion are as follows: (1) the use of standard and specialized image processing and analysis methods is facilitated by having 2D array data versus a volume rendering; (2) accurate lateral dimensional analysis of flaws is possible in the unwrapped sheets versus volume rendering; (3) flaws in the part jump out at the inspector with the proper contrast expansion settings in the unwrapped sheets; and (4) it is much easier for the inspector to locate flaws in the unwrapped sheets versus top view slices for very thin cylinders. The method is fully automated and requires no input from the user except the proper voxel dimension from the CT experiment and the wall thickness of the part. The software is available in 32-bit and 64-bit versions, and can be used with binary data (8- and 16-bit) and BMP type CT image sets. The software has memory (RAM) and hard-drive based modes. The advantage of the (64-bit) RAM-based mode is speed (and it is very practical for users of 64-bit Windows operating systems and computers having 16 GB or more RAM). The advantage of the hard-drive based analysis is that one can work with essentially unlimited-sized data sets. Separate windows are spawned for the unwrapped/re-sliced data view and any image processing interactive capability. Individual unwrapped images and unwrapped image series can be saved in common image formats. More information is available at http://www.grc.nasa.gov/WWW/OptInstr/NDE_CT_CylinderUnwrapper.html.
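
    The unwrapping step described above amounts to resampling each slice from Cartesian (x, y) onto polar (theta, radius) coordinates so that the cylinder wall becomes flat sheets, one per radial depth. A simplified sketch of that resampling is shown below; it is not NASA's software, and the function and its arguments are hypothetical.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def unwrap_cylinder(volume, center, radii, n_theta=720):
            # volume: 3-D array indexed (z, y, x); returns sheets indexed (radius, z, theta)
            cy, cx = center
            theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
            sheets = np.empty((len(radii), volume.shape[0], n_theta))
            for i, r in enumerate(radii):
                ys = cy + r * np.sin(theta)          # sample points on a circle of radius r
                xs = cx + r * np.cos(theta)
                coords = np.vstack([ys, xs])         # (row, column) coordinates for interpolation
                for z in range(volume.shape[0]):
                    sheets[i, z] = map_coordinates(volume[z], coords, order=1)
            return sheets

        vol = np.random.rand(4, 64, 64)              # toy "CT" volume
        print(unwrap_cylinder(vol, center=(32.0, 32.0), radii=[20.0, 25.0]).shape)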

  14. The MUSOS (MUsic SOftware System) Toolkit: A computer-based, open source application for testing memory for melodies.

    Science.gov (United States)

    Rainsford, M; Palmer, M A; Paine, G

    2018-04-01

    Despite numerous innovative studies, rates of replication in the field of music psychology are extremely low (Frieler et al., 2013). Two key methodological challenges affecting researchers wishing to administer and reproduce studies in music cognition are the difficulty of measuring musical responses, particularly when conducting free-recall studies, and access to a reliable set of novel stimuli unrestricted by copyright or licensing issues. In this article, we propose a solution for these challenges in computer-based administration. We present a computer-based application for testing memory for melodies. Created using the software Max/MSP (Cycling '74, 2014a), the MUSOS (Music Software System) Toolkit uses a simple modular framework configurable for testing common paradigms such as recall, old-new recognition, and stem completion. The program is accompanied by a stimulus set of 156 novel, copyright-free melodies, in audio and Max/MSP file formats. Two pilot tests were conducted to establish the properties of the accompanying stimulus set that are relevant to music cognition and general memory research. By using this software, a researcher without specialist musical training may administer and accurately measure responses from common paradigms used in the study of memory for music.

  15. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  16. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  17. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  18. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  19. Advances in software science and technology

    CERN Document Server

    Hikita, Teruo; Kakuda, Hiroyasu

    1993-01-01

    Advances in Software Science and Technology, Volume 4 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 10 chapters, this volume begins with an overview of the historical survey of programming languages for vector/parallel computers in Japan and describes compiling methods for supercomputers in Japan. This text then explains the model of a Japanese software factory, which is presented by the logical configuration that has been satisfied by

  20. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data call for extreme computational power and special computing facilities (i.e. supercomputers). An inexpensive solution, such as general-purpose computation based on graphics processing units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device's internal memory can pose a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
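
    The underlying k-nearest-neighbour computation that the tool parallelises on GPUs can be sketched serially with NumPy as follows (illustrative only; GPU-FS-kNN itself partitions the distance matrix chunk-wise to fit device memory).

        import numpy as np

        def knn(points, k):
            # indices of the k nearest neighbours of every point (excluding itself)
            sq = np.sum(points ** 2, axis=1)
            d2 = sq[:, None] + sq[None, :] - 2.0 * points @ points.T   # pairwise squared distances
            np.fill_diagonal(d2, np.inf)                               # a point is not its own neighbour
            return np.argsort(d2, axis=1)[:, :k]

        data = np.random.rand(1000, 16)               # toy data set
        print(knn(data, k=5).shape)                   # (1000, 5)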

  1. Advanced transport operating system software upgrade: Flight management/flight controls software description

    Science.gov (United States)

    Clinedinst, Winston C.; Debure, Kelly R.; Dickson, Richard W.; Heaphy, William J.; Parks, Mark A.; Slominski, Christopher J.; Wolverton, David A.

    1988-01-01

    The Flight Management/Flight Controls (FM/FC) software for the Norden 2 (PDP-11/70M) computer installed on the NASA 737 aircraft is described. The software computes the navigation position estimates, the guidance commands, and the commands to be issued to the control surfaces to direct the aircraft in flight, based on the modes selected on the Advanced Guidance Control System (AGSC) mode panel and the flight path selected via the Navigation Control/Display Unit (NCDU).

  2. The effects of computer-aided design software on engineering students' spatial visualisation skills

    Science.gov (United States)

    Kösa, Temel; Karakuş, Fatih

    2018-03-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations (PSVT:R) for both the pre- and the post-test. The participants were 116 freshman students in the first year of their undergraduate programme in the Department of Mechanical Engineering at a university in Turkey. A total of 72 students comprised the experimental group; they were instructed with CAD-based activities in an engineering drawing course. The control group consisted of 44 students who did not attend this course. The results of the study showed that a CAD-based engineering drawing course had a positive effect on developing engineering students' spatial visualisation skills. Additionally, the results of the study showed that spatial visualisation skills can be a predictor for success in a computer-aided engineering drawing course.

  3. Software platform virtualization in chemistry research and university teaching.

    Science.gov (United States)

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, Linux or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using virtual machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages, we have confirmed that the computational speed penalty for using virtual machines is low, at around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs, especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics, as well as the missing cheminformatics education at universities worldwide.
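
    The quoted 5% to 10% penalty is simply a relative slowdown between native and virtualised runs of the same benchmark. A trivial sketch of that comparison (the timings below are invented):

        def vm_overhead(native_seconds, vm_seconds):
            # relative slowdown of the virtual machine run versus the native run
            return (vm_seconds - native_seconds) / native_seconds

        print(f"{vm_overhead(native_seconds=120.0, vm_seconds=128.0):.1%}")  # ~6.7% penalty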

  4. Anthocyanins, ascorbic acid, total polyphenols and antioxidant activity in the peel of camu-camu (Myrciaria dubia (H.B.K.) McVaugh)

    Directory of Open Access Journals (Sweden)

    Juan Edson Villanueva-Tiburcio

    2010-05-01

    Full Text Available This work was carried out at UNAS, Tingo María, Peru. The objectives were to evaluate the content of anthocyanins, ascorbic acid and total polyphenols in the fresh and dried peel of camu-camu (Myrciaria dubia (H.B.K.) McVaugh) at different stages of ripeness; to evaluate the antioxidant activity of the dried peel using different types of radicals (DPPH, ABTS+ and peroxyl); and to correlate the ascorbic acid and total polyphenol values with the antioxidant activity. The extraction was carried out in an aqueous medium, and the results of each experiment were analysed in a completely randomized design using Student's t-test (p < 0.05). The peel extract of the fresh ripe fruit showed the highest concentrations of ascorbic acid and anthocyanins compared with the half-ripe and green fruit, with 21.95 mg.g-1 of peel and 46.42 mg.L-1 of cyanidin-3-glucoside, respectively, whereas the dried peel extract of the half-ripe fruit showed the highest ascorbic acid content compared with the ripe and green fruit (53.49 mg.g-1) and the highest total polyphenol content (7.70 mg gallic acid/g). The highest antioxidant activity was found in the dried peel extract of the half-ripe sample, with IC50 = 46.20, 20.25 and 8.30 μg.mL-1 against the DPPH, ABTS+ and peroxyl radicals, respectively.

  5. Prepare for X-Win32 - the new X11 server software for Windows computers

    CERN Multimedia

    IT Department

    2011-01-01

    Starnet X-Win32 will replace Exceed as the X11 Server software on Windows computers by February 2012. X11 Server software allows a Windows user to have a graphical user interface on a remote Linux server. This change, initially motivated by a significant change of license conditions for Exceed, brings an easier integration of Windows and Linux logon mechanisms. At the same time, X-Win32 addresses the common use cases while providing a more intuitive configuration interface. CERN Predefined Connections will be available as before. They offer an easy way of starting applications on LXPLUS using PuTTY or starting the KDE, GNOME or ICE window managers. Since X-Win32 is better integrated with SSH and CERN Kerberos compared to Exceed, it is much simpler to set up secure access to Linux services. The decision to choose X-Win32 as the new X11 software resulted from an evaluation that involved various user communities and support teams. More information, including the documented use cases, is available at https://...

  6. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    Science.gov (United States)

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation has become possible thanks to advancements in epidemiological health risk estimates in recent years. The increase in computational power and the development of numerical techniques allow dose conversion factors to be computed with increasing reliability. The objective of this study was to develop an integrated model and software, based on a self-developed airway deposition code, the authors' own bronchial dosimetry model and the computational methods accepted by the International Commission on Radiological Protection (ICRP), to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    Studies in Computational Intelligence : Volume 492

    2013-01-01

    This edited book presents scientific results of the 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2013), held in Honolulu, Hawaii, USA on July 1-3, 2013. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 17 outstanding papers from those papers accepted for presentation at the conference.  

  8. 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2015-01-01

    This edited book presents scientific results of 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2014) held on June 30 – July 2, 2014 in Las Vegas Nevada, USA. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 13 outstanding papers from those papers accepted for presentation at the conference.

  9. Software Components and Formal Methods from a Computational Viewpoint

    OpenAIRE

    Lambertz, Christian

    2012-01-01

    Software components and the methodology of component-based development offer a promising approach to master the design complexity of huge software products because they separate the concerns of software architecture from individual component behavior and allow for reusability of components. In combination with formal methods, the specification of a formal component model of the later software product or system allows for establishing and verifying important system properties in an automatic a...

  10. Determination of the kinetic constants of ascorbic acid degradation in peach puree: effect of temperature and concentration

    OpenAIRE

    Toralles,Ricardo Peraça; Vendruscolo,João Luiz; Vendruscolo,Claire Tondo; Del Pino,Francisco Augusto Burkert; Antunes,Pedro Luiz

    2008-01-01

    Ascorbic acid, vitamin C, is used extensively in the food industry, not only because of its nutritional value but also because of its functional contributions to product quality. There are many studies on the kinetic stability of ascorbic acid in beverages, but no study was found on the kinetic constants of degradation of ascorbic acid added to peach puree. In this work, the degradation kinetics of ascorbic acid was studied in puree from the peach cultivar...
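
    The kind of kinetic model usually fitted in such studies is first-order degradation, C(t) = C0 exp(-k t), with the rate constant following the Arrhenius relation k(T) = A exp(-Ea / (R T)) to capture the temperature effect. A generic sketch with assumed constants (not the paper's fitted values):

        import numpy as np

        R = 8.314            # gas constant, J mol-1 K-1
        A = 5.0e6            # pre-exponential factor (assumed), 1/min
        Ea = 6.0e4           # activation energy (assumed), J/mol

        def rate_constant(T_kelvin):
            return A * np.exp(-Ea / (R * T_kelvin))

        def concentration(C0, t_minutes, T_kelvin):
            return C0 * np.exp(-rate_constant(T_kelvin) * t_minutes)

        print(concentration(C0=100.0, t_minutes=60.0, T_kelvin=358.15))  # remaining after 1 h at 85 °C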

  11. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  12. Introducing the RadBioStat Educational Software: Computer-Assisted Teaching of the Random Nature of Cell Killing

    Directory of Open Access Journals (Sweden)

    Safari A

    2014-06-01

    Full Text Available The interaction of radiation with cells and tissues has a random nature. Therefore, understanding the random nature of cell killing, which is described by Poisson distribution statistics, is an essential point in the education of radiation biology. RadBioStat is a newly developed educational MATLAB-based software package designed for computer-assisted learning of the target theory in radiation biology. Although its potential applications are developing rapidly, RadBioStat can currently be a useful tool in the computer-assisted education of radiobiological models such as single target single hit, multiple target single hit and multiple target multiple hit. Scholars' feedback is valuable to the producers of this software and helps them continuously improve this product, add new features and increase its desirability and functionality.
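
    The models named above follow standard target theory. A short worked sketch of the single-target single-hit and multi-target single-hit survival formulas (generic textbook formulas, not code from RadBioStat):

        import numpy as np

        def single_target_single_hit(dose, d0):
            # surviving fraction S = exp(-D/D0)
            return np.exp(-dose / d0)

        def multi_target_single_hit(dose, d0, n):
            # surviving fraction S = 1 - (1 - exp(-D/D0))**n
            return 1.0 - (1.0 - np.exp(-dose / d0)) ** n

        doses = np.linspace(0, 10, 6)                 # Gy
        print(single_target_single_hit(doses, d0=1.5))
        print(multi_target_single_hit(doses, d0=1.5, n=3))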

  13. Fundamentals of multicore software development

    CERN Document Server

    Pankratius, Victor; Tichy, Walter F

    2011-01-01

    With multicore processors now in every computer, server, and embedded device, the need for cost-effective, reliable parallel software has never been greater. By explaining key aspects of multicore programming, Fundamentals of Multicore Software Development helps software engineers understand parallel programming and master the multicore challenge. Accessible to newcomers to the field, the book captures the state of the art of multicore programming in computer science. It covers the fundamentals of multicore hardware, parallel design patterns, and parallel programming in C++, .NET, and Java. It

  14. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques that are applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyze...

  15. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners - partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  16. The Economics of Educational Software Portability.

    Science.gov (United States)

    Oliveira, Joao Batista Araujo e

    1990-01-01

    Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…

  17. Effect of ascorbic acid in bread dough in the presence of tannic acid - doi: 10.4025/actascitechnol.v32i2.5290

    Directory of Open Access Journals (Sweden)

    Ana Leticia Gomes Saraiva

    2010-07-01

    Full Text Available Wheat flour breads were prepared with the addition of the oxidizing agent ascorbic acid, in an attempt to find the tannic acid concentration that best retained the ascorbic acid without significantly affecting the rheological characteristics and the main sensory attributes of the bread. The breads were prepared with the standard bread formulation. In the first stage, the ascorbic acid concentration that gave the best stability to the standard bread dough (300 ppm) was determined. In the next stage, three different concentrations of tannic acid were used in the standard dough formulation, 0.10, 0.20 and 0.30%, in order to determine the concentration that best retained the ascorbic acid. The breads were made with the standard formulation. After the breads had cooled, the retention of ascorbic acid present in the bread was evaluated according to the previously determined concentration. The results indicated that the bread made with the standard bread dough (300 ppm of ascorbic acid) had an ascorbic acid concentration of 41.50 mg 100 g-1 of bread.

  18. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  19. Histopathological observations of juvenile Penaeus vannamei subjected to artificial diets with different concentrations of an ascorbic acid salt (vitamin C)

    OpenAIRE

    Vera Muñoz, L.

    1995-01-01

    Histopathological observations of juvenile Penaeus vannamei subjected to artificial diets with different concentrations of an ascorbic acid salt (vitamin C). Histological analyses, together with histochemical confirmation of the presence of melanin, were carried out on juvenile Penaeus vannamei subjected to five experimental diets containing different concentrations of a Mg L-ascorbate-2-phosphate salt (APM) used as a source of ascorbic acid (AA).

  20. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    NREL maintains a variety of application and environment modules for use on the Peregrine system. The site lists software applications by name and research area/discipline, as well as software libraries available for linking and loading.

  1. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copyright was initially used in the cultural and art industries. Since then there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors, and the cultural approach, which is especially concerned with the rights of the author. The first approach is rooted in Anglo-American countries, while the other is originally French. The expansion of the computer market and the separation of the software and hardware markets led to the so-called "velvet-rubbery", which refers to illegal reproduction in the market. Therefore, there have been efforts all over the world to protect the rights of producers. Besides the domestic and international difficulties these strategies encounter, the present study reviews different strategies to face this challenge.

  2. A Survey of Software Infrastructures and Frameworks for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Christoph Endres

    2005-01-01

    Full Text Available In this survey, we discuss 29 software infrastructures and frameworks which support the construction of distributed interactive systems. They range from small projects with one implemented prototype to large scale research efforts, and they come from the fields of Augmented Reality (AR), Intelligent Environments, and Distributed Mobile Systems. In their own way, they can all be used to implement various aspects of the ubiquitous computing vision as described by Mark Weiser [60]. This survey is meant as a starting point for new projects, in order to choose an existing infrastructure for reuse, or to get an overview before designing a new one. It tries to provide a systematic, relatively broad (and necessarily not very deep) overview, while pointing to relevant literature for in-depth study of the systems discussed.

  3. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    Full Text Available Mobile cloud computing (MCC), which combines mobile computing and the cloud concept, takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to the user's mobility, the provided resources have different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes a programmable resource provision for heterogeneous energy-constrained computing environments, where a software defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem that contains the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for all of the studied cases almost hit the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
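
    The search procedure named above is a particle swarm over candidate resource assignments. A toy single-objective sketch of that mechanism follows; the weighted objective, bounds and parameters are assumptions for illustration, not the paper's multiobjective model.

        import numpy as np

        rng = np.random.default_rng(0)

        def objective(x):
            # x = (energy, cost, availability) of a candidate offload plan, all scaled to [0, 1]
            energy, cost, availability = x
            return 0.4 * energy + 0.4 * cost + 0.2 * (1.0 - availability)

        def pso(n_particles=30, n_iter=100, dim=3, w=0.7, c1=1.5, c2=1.5):
            pos = rng.random((n_particles, dim))
            vel = np.zeros((n_particles, dim))
            pbest = pos.copy()
            pbest_val = np.array([objective(p) for p in pos])
            gbest = pbest[np.argmin(pbest_val)].copy()
            for _ in range(n_iter):
                r1, r2 = rng.random((2, n_particles, dim))
                vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, 0.0, 1.0)
                vals = np.array([objective(p) for p in pos])
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
                gbest = pbest[np.argmin(pbest_val)].copy()
            return gbest, objective(gbest)

        print(pso())   # best plan found and its weighted score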

  4. Clinical significance of HPV DNA cotesting in Korean women with ASCUS or ASC-H.

    Science.gov (United States)

    Lee, Sanghoon; Kim, Jae Won; Hong, Jin Hwa; Song, Jae Yun; Lee, Jae Kwan; Kim, In Sun; Lee, Nak Woo

    2014-12-01

    The purpose of this study was to evaluate the clinical significance of human papillomavirus (HPV) DNA cotesting in Korean women with abnormal Papanicolaou (Pap) smear results based on colposcopic pathology. A total of 1012 women underwent liquid-based Pap smears and hybrid capture II HPV DNA tests followed by colposcopy at the Korea University Hospital from January 2007 to May 2012. Of these women, 832 were included in this retrospective study. The mean patient age was 45.4 ± 13.7 years (range: 15-80). The distribution of Pap smear results was normal (4.7%), atypical squamous cells of undetermined significance (ASCUS) (42.1%), low-grade squamous intraepithelial lesion (26.8%), ASC-H (7.0%), and high-grade squamous intraepithelial lesion (HSIL) (19.5%). In women with ASCUS, none of the 87 HPV-negative women had cervical intraepithelial neoplasia grade 2 or worse (≥CIN2). Across age groups, ASCUS and ASC-H results combined with HPV testing furnish healthcare providers with informative data: there is a lower proportion of ≥CIN2 in HPV-negative women and a higher proportion of ≥CIN2 in HPV-positive women. When HPV data were further evaluated by age group, the risk of ≥CIN2 was lower in HPV-negative women, especially in women aged 30 and older. © 2014 Wiley Periodicals, Inc.

  5. Effects of Metal Micro and Nano-Particles on hASCs: An In Vitro Model.

    Science.gov (United States)

    Palombella, Silvia; Pirrone, Cristina; Rossi, Federica; Armenia, Ilaria; Cherubino, Mario; Valdatta, Luigi; Raspanti, Mario; Bernardini, Giovanni; Gornati, Rosalba

    2017-08-03

    As knowledge about the interference of nanomaterials with human stem cells is scarce and contradictory, we undertook a comparative multidisciplinary study based on the size effect of zero-valent iron, cobalt, and nickel microparticles (MPs) and nanoparticles (NPs), using human adipose stem cells (hASCs) as a model and evaluating cytotoxicity, morphology, cellular uptake, and gene expression. Our results suggested that the medium did not influence the cell sensitivity but, surprisingly, the iron microparticles (FeMPs) turned out to be toxic. These data were supported by modifications in the mRNA expression of some genes implicated in the inflammatory response. Microscopic analysis confirmed that NPs, mainly internalized by endocytosis, persist in vesicles without any apparent cell damage. Conversely, MPs are not internalized, and their effects on hASCs have to be ascribed to the release of ions into the culture medium, or to the reduced oxygen and nutrient exchange efficiency due to MPs agglomerating around the cells. Although the results depict a heterogeneous picture that does not allow a general conclusion to be drawn, this work reiterates the importance of comparative investigations on MPs, NPs, and the corresponding ions, and the need to continue the thorough verification of NP and MP innocuousness to ensure unaffected stem cell physiology and differentiation.

  6. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. A security example describes an event that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the … service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had…

  7. Software of the BESM-6 computer for automatic image processing from liquid-hydrogen bubble chambers

    International Nuclear Information System (INIS)

    Grebenikov, E.A.; Kiosa, M.N.; Kobzarev, K.K.; Kuznetsova, N.A.; Mironov, S.V.; Nasonova, L.P.

    1978-01-01

    A set of programs, which is used in "road guidance" mode on the BESM-6 computer to process picture information taken in liquid hydrogen bubble chambers, is discussed. This mode allows the system to process data from an automatic scanner (AS) taking into account the results of manual scanning. The system hardware includes: an automatic scanner, an M-6000 mini-controller and a BESM-6 computer. Software is functionally divided into the following units: computation of event mask parameters and generation of data files controlling the AS; front-end processing of data coming from the AS; filtering of track data; simulation of AS operation and gauging of the AS reference system. To speed up the overall performance, programs which receive and decode data, coming from the AS via the M-6000 controller and the data link to the BESM-6 computer, are written in machine language.

  8. PCG: A software package for the iterative solution of linear systems on scalar, vector and parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, W. [Los Alamos National Lab., NM (United States); Carey, G.F. [Univ. of Texas, Austin, TX (United States)

    1994-12-31

    A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as the techniques used to obtain high performance and transportability across architectures. Representative numerical results are presented for several machines including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
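
    The iteration at the heart of such a library is the preconditioned conjugate gradient method. A plain serial NumPy sketch of it is given below for illustration; the PCG package itself targets scalar, vector and parallel machines and is not reproduced here.

        import numpy as np

        def pcg(A, b, M_inv, tol=1e-8, max_iter=200):
            # solve A x = b for symmetric positive definite A with preconditioner M_inv
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv @ r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv @ r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        M_inv = np.diag(1.0 / np.diag(A))      # Jacobi preconditioner
        print(pcg(A, b, M_inv))                # approximately [0.0909, 0.6364]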

  9. Computational Homology for Software Validation

    Science.gov (United States)

    2015-03-01

    …involving compound data-types. Subject terms: abstract datatypes, convergence structure, topological methods, specification logics, hybrid software. …composite datatype values are networks strewn through the device's memory; think of a variable whose type is an array of balanced trees. Some means…, structural in nature, is required to rigorously specify the evolution of composite states involving non-numerical, non-metric components. Composite datatypes

  10. Effects of Metal Micro and Nano-Particles on hASCs: An In Vitro Model

    OpenAIRE

    Palombella, Silvia; Pirrone, Cristina; Rossi, Federica; Armenia, Ilaria; Cherubino, Mario; Valdatta, Luigi; Raspanti, Mario; Bernardini, Giovanni; Gornati, Rosalba

    2017-01-01

    As knowledge about the interference of nanomaterials with human stem cells is scarce and contradictory, we undertook a comparative multidisciplinary study based on the size effect of zero-valent iron, cobalt, and nickel microparticles (MPs) and nanoparticles (NPs), using human adipose stem cells (hASCs) as a model and evaluating cytotoxicity, morphology, cellular uptake, and gene expression. Our results suggested that the medium did not influence the cell sensitivity but, surprisingly...

  11. Software reliability and safety in nuclear reactor protection systems

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor.

  12. Software reliability and safety in nuclear reactor protection systems

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor.

  13. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  14. Software life cycle methodologies and environments

    Science.gov (United States)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer-Aided Software Engineering (CASE) technology in two areas. Environments include the Engineering Script Language/Parts Composition System (ESL/PCS) application generator; an intelligent user interface for cost avoidance in setting up operational computer runs; a framework programmable platform for defining process and software development work flow control; a process for bringing CASE technology into an organization's culture; and the CLIPS/CLIPS Ada language for developing expert systems. Methodologies include a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert system problems when only approximate truths are known.

  15. Survey and analyses of computer software usage in Calabar ...

    African Journals Online (AJOL)

    This work is to find out the most used software and the type of jobs mostly done. A descriptive analysis using simple percentages revealed that word processing software is the most used software followed by graphics, database and accounting in a decreasing order respectively. A comparative examination of the use of the ...

  16. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode, for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has a unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare the U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)
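
    As a hedged illustration of the underlying bookkeeping (not DCAL's modules, file formats, or data libraries), a dose coefficient converts an intake into a committed dose, and doses from several nuclides add; the coefficient values below are illustrative placeholders only:

        # Illustration only: committed effective dose from radionuclide intakes,
        # dose (Sv) = intake (Bq) * dose coefficient (Sv/Bq), summed over nuclides.
        # The coefficient values are placeholders, not DCAL library data.
        dose_coefficients_sv_per_bq = {   # hypothetical ingestion coefficients
            "Cs-137": 1.3e-8,
            "Sr-90": 2.8e-8,
        }

        intakes_bq = {"Cs-137": 5.0e3, "Sr-90": 1.0e3}

        committed_dose_sv = sum(
            intakes_bq[nuclide] * dose_coefficients_sv_per_bq[nuclide]
            for nuclide in intakes_bq
        )
        print(f"Committed effective dose: {committed_dose_sv:.2e} Sv")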

  17. ASC ATDM Level 2 Milestone #5325: Asynchronous Many-Task Runtime System Analysis and Assessment for Next Generation Platforms.

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Gavin Matthew; Bettencourt, Matthew Tyler; Bova, Steven W.; Franko, Ken; Gamell, Marc; Grant, Ryan; Hammond, Simon David; Hollman, David S; Knight, Samuel; Kolla, Hemanth; Lin, Paul; Olivier, Stephen Lecler; Sjaardema, Gregory D.; Slattengren, Nicole Lemaster; Teranishi, Keita; Wilke, Jeremiah J; Bennett, Janine Camille; Clay, Robert L.; Kale, Laxmikant; Jain, Nikhil; Mikida, Eric; Aiken, Alex; Bauer, Michael; Lee, Wonchan; Slaughter, Elliott; Treichler, Sean; Berzins, Martin; Harman, Todd; Humphreys, Alan; Schmidt, John; Sunderland, Dan; Mccormick, Pat; Gutierrez, Samuel; Schulz, Martin; Gamblin, Todd; Bremer, Peer-Timo

    2015-09-01

    This report provides in-depth information and analysis to help create a technical road map for developing next-generation programming models and runtime systems that support Advanced Simulation and Computing (ASC) workload requirements. The focus herein is on asynchronous many-task (AMT) models and runtime systems, which are of great interest in the context of exascale computing, as they hold the promise to address key issues associated with future extreme-scale computer architectures. This report includes a thorough qualitative and quantitative examination of three best-of-class AMT runtime systems - Charm++, Legion, and Uintah - all of which are in use as part of the Predictive Science Academic Alliance Program II (PSAAP-II) Centers. The studies focus on each runtime's programmability, performance, and mutability. Through the experiments and analysis presented, several overarching findings emerge. From a performance perspective, AMT runtimes show tremendous potential for addressing extreme-scale challenges. Empirical studies show that an AMT runtime can mitigate performance heterogeneity inherent to the machine itself, and that Message Passing Interface (MPI) and AMT runtimes perform comparably under balanced conditions. From a programmability and mutability perspective, however, none of the runtimes in this study are currently ready for use in developing production-ready Sandia ASC applications. The report concludes by recommending a co-design path forward, wherein application, programming model, and runtime system developers work together to define requirements and solutions. Such a requirements-driven co-design approach benefits the community as a whole, with widespread community engagement mitigating risk for both application developers and high-performance computing runtime system developers.
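
    As a hedged illustration of the asynchronous many-task style itself (a generic Python sketch using futures, not the APIs of Charm++, Legion, or Uintah), work is expressed as tasks whose launch order is driven by the availability of their inputs:

        # Minimal asynchronous many-task sketch: independent tasks run concurrently
        # and a dependent task starts once its inputs are available. This only
        # illustrates the AMT programming style, not any specific AMT runtime.
        from concurrent.futures import ThreadPoolExecutor

        def partial_sum(chunk):
            return sum(chunk)

        def combine(a, b):
            return a + b

        data = list(range(1_000))
        with ThreadPoolExecutor(max_workers=4) as pool:
            # Two independent leaf tasks.
            f_left = pool.submit(partial_sum, data[:500])
            f_right = pool.submit(partial_sum, data[500:])
            # Dependent task: submitted after its input futures resolve.
            f_total = pool.submit(combine, f_left.result(), f_right.result())
            print(f_total.result())   # 499500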

  18. Using Office Simulation Software in Teaching Computer Literacy Using Three Sets of Teaching/Learning Activities

    Directory of Open Access Journals (Sweden)

    Azad Ali

    2016-05-01

    The most common course delivery model is based on the teacher (knowledge provider) - student (knowledge receiver) relationship. The most visible symptom of this situation is an over-reliance on textbook tutorials. This traditional delivery model reduces teacher flexibility, causes a lack of interest among students, and often makes classes boring; this is especially visible when teaching computer literacy courses. Instead, the authors of this paper suggest a new active model based on MS Office simulation. The proposed model is discussed within the framework of three activities: guided software simulation, instructor-led activities, and self-directed learning activities. The proposed model of active teaching based on software simulation proved more effective than the traditional one.

  19. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.: Let hydrologists learn the latest computer science by working with Research Software Engineers (RSEs) and not reinvent the waterwheel ourselves

    Science.gov (United States)

    Hut, R. W.; van de Giesen, N. C.; Drost, N.

    2017-05-01

    The suggestions by Hutton et al. might not be enough to guarantee reproducible computational hydrology: archiving software code and research data alone will not suffice. We add to the suggestions of Hutton et al. that hydrologists should not only document their (computer) work, but also use the latest best practices in designing research software, most notably the use of containers and open interfaces. To make sure hydrologists know of these best practices, we urge close collaboration with Research Software Engineers (RSEs).
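
    As a hedged sketch of what such an "open interface" could look like in practice (the class and method names below are assumptions for this illustration, not an interface prescribed by the comment), any model exposing a small documented contract can be driven by the same experiment script, which pairs naturally with packaging the whole stack in a container:

        # Illustrative open model interface: every model exposing these methods can
        # be swapped into the same reproducible experiment script. The interface
        # and method names are assumptions for this sketch.
        from abc import ABC, abstractmethod

        class OpenModelInterface(ABC):
            @abstractmethod
            def initialize(self, config_file: str) -> None: ...
            @abstractmethod
            def update(self) -> None: ...        # advance one time step
            @abstractmethod
            def get_value(self, name: str) -> float: ...
            @abstractmethod
            def finalize(self) -> None: ...

        class LinearReservoir(OpenModelInterface):
            """Toy bucket model: storage decays at a fixed rate each step."""
            def initialize(self, config_file: str) -> None:
                # A real model would read these from config_file (hypothetical path).
                self.storage, self.rate = 100.0, 0.1
            def update(self) -> None:
                self.storage -= self.rate * self.storage
            def get_value(self, name: str) -> float:
                return self.storage
            def finalize(self) -> None:
                pass

        model = LinearReservoir()
        model.initialize("config.yml")   # hypothetical config path
        for _ in range(3):
            model.update()
        print(model.get_value("storage"))   # ~72.9 after three decay steps
        model.finalize()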

  20. Meten aan software: ook in het informatica-onderwijs [Measurement of Software: also in Computer Science Courses]

    NARCIS (Netherlands)

    van den Berg, Klaas

    1995-01-01

    Software metrics are used to quantify the quality of software, namely of the software product, the development process, and the required resources. These metrics can be used in computer science education by both the teacher and the student. Two examples of ...
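
    As a hedged illustration of the idea of product metrics (the specific metrics and the helper below are assumptions for this sketch, not taken from the article), a source file can be quantified with simple counts such as lines of code and comment density:

        # Toy product metrics for a Python source file: physical lines of code and
        # comment density. Chosen only to illustrate quantifying software quality;
        # these are not the metrics discussed in the article above.
        def simple_metrics(path):
            with open(path, encoding="utf-8") as f:
                lines = [line.strip() for line in f]
            nonblank = [line for line in lines if line]
            comments = [line for line in nonblank if line.startswith("#")]
            loc = len(nonblank)
            return {
                "loc": loc,
                "comment_density": len(comments) / loc if loc else 0.0,
            }

        # Example: measure this script itself.
        if __name__ == "__main__":
            print(simple_metrics(__file__))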